  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

The use of a radio frequency tracking system to quantify the external demands of elite wheelchair rugby

Rhodes, James M. January 2015 (has links)
Within team sports, coaches aim to improve physical preparation by optimising the training process specific to competition. Unfortunately, at the elite level of wheelchair rugby (WCR) evidence-based information to guide this process is currently lacking. The present thesis investigates measures of external load during elite competition and explores whether these can be translated to inform current training practices. The first study established the suitability of a radio frequency-based indoor tracking system (ITS) for the collection of movements specific to WCR. Minimal relative distance errors (< 0.2%) were seen across different sampling frequencies. Peak speed displayed the greatest relative error in 4 Hz tags (2%), with significantly lower errors observed in higher-frequency tags (< 1%). The ITS was therefore deemed an acceptable tool for quantifying external load specific to WCR using a sampling frequency of 8 or 16 Hz. The external demands of elite competition were determined in Chapters 4 and 5. Notable differences in the volume of activity were displayed across the International Wheelchair Rugby Federation (IWRF) classification groups. However, the specific positional requirements of low-point (LP) and high-point (HP) players appeared to influence the intensity of external load (Chapter 4). Chapter 5 extended this work and established that peak speed and the ability to perform at high intensities were best associated with successful mobility performance in WCR, as defined by team rank. This was further shown to be role-dependent, whereby high-ranked HP players achieved greater peak speeds and performed more high-intensity activities (HIA) than their lower-ranked counterparts. The external demands of training were then compared with those of competition (Chapter 6). Conditioning drills were shown to exceed the demands of competition, irrespective of classification. 
Notable differences in skill-based and game-related drills were displayed across player classifications, whereby both were shown to be role-dependent. Although game-simulation drills provided the best representation of competition, duration appeared important since this factor influenced the results (Chapter 6). When the format of these drills was further modified (Chapter 7), drills containing fewer players increased the volume and intensity of training, specifically in HP players. Whilst a 30-second shot-clock elicited no changes in external load, differences were revealed when the shot-clock was further reduced to 15 s. Coaches can therefore modify the external training response by making subtle changes to the format of game-simulation drills. This thesis revealed that functional classification and positional role are key factors during competition, and training should therefore be structured with this in mind. Conditioning drills can be used to elicit a progressive overload in the external responses, whilst game-simulation drills provide the best representation of competition. Given the importance of game-simulation drills, combining different formats throughout training sessions is critical in order to maximise the preparation of elite WCR players.
62

TFPS : um sistema de pré-processamento de traces para auxiliar na visualização de programas paralelos / TFPS - a traces preprocessing system to aid in parallel programs visualization

Stringhini, Denise January 1997 (has links)
O trabalho apresenta o projeto e o desenvolvimento de uma ferramenta para visualização lógica da execução de programas paralelos, a TFPS (Trace File Preprocessor System), cujo objetivo é a análise de desempenho de tais programas. O projeto é baseado no pré-processamento de arquivos de traces de execução dos programas. A idéia básica consiste em aproveitar as informações fornecidas pela monitoração. Estas informações, que em geral são utilizadas apenas para dirigir a animação post-mortem destes programas, neste caso são utilizadas também na montagem das janelas de visualização. Assim, são descritos o pré-processador e a montagem das janelas de visualização. O primeiro é responsável principalmente pela leitura e análise das informações contidas no arquivo de trace e pela geração de um arquivo de saída com todas as informações necessárias à montagem das janelas. Estas foram concebidas levando em consideração o tipo de informação que pode ser obtido de um arquivo de trace. Desta forma, foi possível aproximar o conteúdo das janelas de visualização o máximo possível do programa paralelo em análise. Com o objetivo de demonstrar esta aproximação, foi construído um protótipo tanto para o pré-processador quanto para a ferramenta de visualização. Ambos os protótipos são descritos neste trabalho. / This study presents the design and development of a logical visualization tool for parallel programs, TFPS (Trace File Preprocessor System), whose goal is the performance analysis of such programs. The project is based on the preprocessing of trace files from the programs' execution. The basic idea consists in making use of the information given by the monitoring process. This information, which is generally used only to drive the post-mortem animation of these programs, is in this case also used to create the visualization displays. Thus, the preprocessor and the creation of the visualization displays are described. 
The first is mainly responsible for reading and analyzing the information present in the trace file and for generating an output file with all the information necessary for creating the views. The latter were conceived by taking into consideration the type of information that can be obtained from a trace file. It was therefore possible to make the content of the visualization displays as close as possible to the parallel program being analyzed. A prototype of the preprocessor as well as of the visualization tool was built in order to demonstrate the described approach. Both prototypes are described in this study.
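The preprocessing step described above can be sketched in a few lines. This is a minimal illustration using a hypothetical line-based trace format ("timestamp process event"), not the actual TFPS file format:

```python
from collections import defaultdict

def preprocess_trace(lines):
    """Reduce raw trace events to the per-process summary a visualization
    front end would consume: event counts plus the overall time span."""
    counts = defaultdict(lambda: defaultdict(int))
    t_min, t_max = float("inf"), float("-inf")
    for line in lines:
        ts, proc, event = line.split()
        t = float(ts)
        t_min, t_max = min(t_min, t), max(t_max, t)
        counts[proc][event] += 1
    return {"span": (t_min, t_max),
            "events": {p: dict(c) for p, c in counts.items()}}

trace = ["0.10 p0 send", "0.12 p1 recv", "0.30 p0 send", "0.45 p1 send"]
summary = preprocess_trace(trace)
```

The single pass over the trace mirrors the TFPS idea: the monitoring data is read once and condensed into exactly the information the display windows need.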
63

Análise da performance dos fundos de investimentos em ações no Brasil / Performance analysis of equity mutual funds in Brazil

Marco Antonio Laes 02 December 2010 (has links)
O objetivo desta dissertação é analisar a performance da indústria de fundos de investimentos em ações no Brasil. Alvo de poucos estudos no mercado nacional, a análise do desempenho da gestão de carteiras se faz cada vez mais importante, dado o avanço, ao longo dos últimos anos, dos fundos de investimentos como destino da poupança privada brasileira. As análises tradicionais, em que é testada individualmente a significância do alfa (intercepto) de regressões dos retornos dos fundos, utilizando-se geralmente o CAPM ou o modelo de Fama-French (ou alguma variante destes), sofrem de diversos problemas, como a provável não-normalidade dos erros (73,8% em nossa amostra) e a não-consideração da correlação entre os alfas dos diversos fundos, invalidando-se as inferências tradicionais. O maior problema desta abordagem, porém, é que se ignora o fato de que, dentro de um universo grande de fundos, espera-se que alguns destes apresentem desempenho superior não por uma gestão diferenciada de suas carteiras, mas por mera sorte. A fim de superar esta dificuldade, o presente estudo, utilizando uma amostra de 812 fundos de ações durante o período 2002-2009 (incluindo-se fundos sobreviventes e não-sobreviventes), simulou a distribuição cross-sectional dos alfas (e de suas respectivas estatísticas-t) destes fundos através de técnicas de bootstrap, buscando-se com este procedimento eliminar o fator sorte nas análises. Os resultados foram de acordo com a literatura internacional, apresentando evidências da existência de pouquíssimos fundos com performance superior de fato, ao passo que um grande número de fundos apresentou um desempenho negativo, não por azar, mas por real gestão inferior. / The purpose of this dissertation is to examine the performance of the equity mutual fund industry in Brazil. 
Although the subject of few studies in the Brazilian market, the performance analysis of active management has become increasingly important, given the advance, especially over the last few years, of mutual funds as a destination for Brazilian private savings. The traditional analysis, in which the significance of the alpha (the intercept) from regressions of fund returns is tested individually, generally using the CAPM or the Fama-French model (or some variant of these), suffers from a large array of problems, from the non-normality of errors (73.8% in our sample) to the failure to account for the correlation between the alphas of the various funds, invalidating traditional inferences. The biggest problem with this approach, however, is that it ignores the fact that, in a large universe of funds, some funds are expected to show superior performance not through differentiated management but through mere luck. In order to address these shortcomings, the present study, using an extensive sample of 812 equity mutual funds during the 2002-2009 period (both surviving and non-surviving funds), simulates the cross-sectional distribution of alphas (and their t-statistics) through bootstrap techniques, aiming with this procedure to eliminate the luck factor from the analysis. The results were in accordance with the international literature, showing evidence that only a few funds present actual superior performance, while a large number of funds present actual negative performance, not because they were unlucky, but due to inferior management.
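The luck-versus-skill logic behind the bootstrap can be sketched as follows. This is a toy illustration (fund count, sample length, and volatility are invented, not the thesis's data): simulate many funds whose true alpha is zero and record the cross-sectional maximum t-statistic produced by chance alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n_funds, n_months, n_boot = 200, 96, 500

def max_t_under_null():
    # Each fund's excess return is pure noise, so its true alpha is zero.
    r = rng.normal(0.0, 0.05, size=(n_funds, n_months))
    alpha_hat = r.mean(axis=1)
    se = r.std(axis=1, ddof=1) / np.sqrt(n_months)
    return np.max(alpha_hat / se)

luck_dist = np.array([max_t_under_null() for _ in range(n_boot)])
# A top fund's observed t-statistic is evidence of skill only if it
# exceeds most of this luck distribution, e.g. its 95th percentile.
threshold = np.quantile(luck_dist, 0.95)
```

The point of the exercise: with 200 funds, the best t-statistic under the null is routinely far above the usual 1.96 cutoff, which is exactly why individual significance tests overstate the number of skilled managers.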
65

Towards secure web services : performance analysis, decision making and steganography approaches

Alrouh, Bachar January 2011 (has links)
Web services provide a platform-neutral and programming language-independent technology that supports interoperable machine-to-machine interaction over a network. Clients and other systems interact with Web services using a standardised XML messaging system, such as the Simple Object Access Protocol (SOAP), typically conveyed over HTTP with an XML serialisation in conjunction with other related Web standards. Nevertheless, the idea of applications from different parties communicating together raises security threats. The challenge of Web services security is to understand and weigh the risks of securing a Web-based service using existing security techniques, while simultaneously following evolving standards in order to fill the gaps in Web services security. However, the performance of these security mechanisms raises concerns, due to the additional security content in SOAP messages, the higher number of message exchanges needed to establish trust, and the extra CPU time required to process these additions. As the interaction between service providers and requesters occurs via XML-based SOAP messages, securing Web services tends to make these messages longer than they would otherwise be, and consequently requires interpretation by XML parsers on both sides, which reduces the performance of Web services. The work described in this thesis can be broadly divided into three parts, the first of which studies and compares the performance of various security profiles applied to a Web service tested with different initial message sizes. The second part proposes a multi-criteria decision-making framework to aid Web services developers and architects in selecting the security profile best suited to the requirements of a given application during the development process in a systematic, manageable, and effective way. 
The proposed framework, based on the Analytical Hierarchy Process (AHP) approach, incorporates not only the security requirements but also the performance considerations and the configuration constraints of these security profiles. The framework is then validated and evaluated using a scenario-driven approach, demonstrating situations in which it is used to make informed decisions, ranking various security profiles in order to select the most suitable one for each scenario. Finally, the last part of this thesis develops a novel steganography method for SOAP messages within Web services environments. This method is based on changing the order of XML elements according to a secret message. It has high imperceptibility: it leaves almost no trail, because it uses the communication protocol as a cover medium and keeps the structure and size of the SOAP message intact. The method is empirically validated using a feasible scenario so as to demonstrate its utility and value.
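Hiding data in element order can be sketched as a permutation code. This is a minimal illustration of the general idea, not the thesis's exact embedding scheme: n reorderable sibling elements admit n! orderings, i.e. about log2(n!) hidden bits, without changing message content or size.

```python
from math import factorial

def int_to_permutation(k, items):
    """Map an integer k in [0, n!) to a unique ordering of items."""
    items, out = list(items), []
    for i in range(len(items), 0, -1):
        idx, k = divmod(k, factorial(i - 1))  # factorial-base digit
        out.append(items.pop(idx))
    return out

def permutation_to_int(perm, original):
    """Recover k from the received ordering (inverse mapping)."""
    pool, k = list(original), 0
    for item in perm:
        idx = pool.index(item)
        k = k * len(pool) + idx  # Horner evaluation in factorial base
        pool.pop(idx)
    return k

# Hypothetical reorderable SOAP header elements.
headers = ["<To/>", "<From/>", "<MessageID/>", "<Action/>"]
secret = 13                          # any value in [0, 4!) = [0, 24)
cover_order = int_to_permutation(secret, headers)
```

Because the receiver only needs to know the canonical element order, nothing is added to the message itself, which is what gives this family of methods its imperceptibility.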
66

Centrifugal compressor modeling development and validation for a turbocharger component matching system

Erickson, Christopher Erik January 1900 (has links)
Master of Science / Department of Mechanical and Nuclear Engineering / Kirby S. Chapman / This thesis outlines the development of a centrifugal compressor model for the Turbocharger Component Matching System (TuCMS) software package, which can be used to inexpensively analyze turbocharger performance. The TuCMS can also be used to match turbocharger components to integrate and optimize turbocharger-engine performance. The software system is being developed with the intent of reducing the time taken to experimentally match a turbocharger with an engine, a task that is key to engine emission reductions. The TuCMS uses one-dimensional thermo-fluid equations to analyze the compressor side of a turbocharger. For each compressor component, the program calculates the velocities, pressures, temperatures, pressure losses, work consumption, and efficiencies for a specified set of turbocharger geometry, atmospheric conditions, rotational speed, and fluid mass flow rate. The compressor model includes established loss models found in the open literature. The TuCMS utilizes a component-based architecture to simplify model enhancements, and can be used as a cost-effective engineering tool for preliminary turbocharger testing during engine upgrades and modifications. In this thesis, the TuCMS compressor model was used as an analysis tool to further understand the Variable Geometry Turbocharger (VGT) experimental results. The VGT is a unique turbocharger that can change the diffuser vane angle over a wide range of positions. The change in diffuser vane angle yields optimal turbocharger performance at various operating conditions and potentially increases the operating range. The purpose of the TuCMS compressor model analysis is to identify the change in performance as the diffuser vane angles are adjusted. 
The TuCMS can ideally be used as a control program for the VGT, adjusting the diffuser vane angles as the compressor load changes to ensure the compressor is operating at its highest efficiency.
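The flavor of one-dimensional calculation involved can be sketched with a back-of-the-envelope impeller estimate. All numbers, the simple slip-factor treatment, and the assumed efficiency below are illustrative; they are not the thesis's loss models or data:

```python
import math

def impeller_pressure_ratio(rpm, d2, t01, eta, slip=0.9,
                            cp=1005.0, gamma=1.4):
    """Euler work with a slip factor for radial blades, converted to a
    total pressure ratio through an assumed isentropic efficiency."""
    u2 = math.pi * d2 * rpm / 60.0        # impeller tip speed, m/s
    work = slip * u2**2                   # Euler work per unit mass, J/kg
    dt0 = work / cp                       # total temperature rise, K
    return (1.0 + eta * dt0 / t01) ** (gamma / (gamma - 1.0))

# Illustrative operating point: 120 mm impeller at 60 krpm, 298 K inlet.
pr = impeller_pressure_ratio(rpm=60_000, d2=0.12, t01=298.0, eta=0.75)
```

A full 1-D code like TuCMS refines each step of this chain (inducer, impeller, diffuser, volute) with empirical loss correlations; the sketch only shows how geometry, speed, and inlet conditions combine into a performance prediction.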
67

Moving-Average Transient Model for Predicting the Back-surface Temperature of Photovoltaic Modules

January 2020 (has links)
abstract: The operating temperature of photovoltaic (PV) modules has a strong impact on the expected performance of said modules in photovoltaic arrays. As the install capacity of PV arrays grows throughout the world, improved accuracy in modeling of the expected module temperature, particularly at finer time scales, requires improvements in the existing photovoltaic temperature models. This thesis work details the investigation, motivation, development, validation, and implementation of a transient photovoltaic module temperature model based on a weighted moving-average of steady-state temperature predictions. This thesis work first details the literature review of steady-state and transient models that are commonly used by PV investigators in performance modeling. Attempts to develop models capable of accounting for the inherent transient thermal behavior of PV modules are shown to improve on the accuracy of the steady-state models while also significantly increasing the computational complexity and the number of input parameters needed to perform the model calculations. The transient thermal model development presented in this thesis begins with an investigation of module thermal behavior performed through finite-element analysis (FEA) in a computer-aided design (CAD) software package. This FEA was used to discover trends in transient thermal behavior for a representative PV module in a timely manner. The FEA simulations were based on heat transfer principles and were validated against steady-state temperature model predictions. The dynamic thermal behavior of PV modules was determined to be exponential, with the shape of the exponential being dependent on the wind speed and mass per unit area of the module. The results and subsequent discussion provided in this thesis link the thermal behavior observed in the FEA simulations to existing steady-state temperature models in order to create an exponential weighting function. 
This function can perform a weighted average of steady-state temperature predictions within 20 minutes of the time in question to generate a module temperature prediction that accounts for the inherent thermal mass of the module while requiring only simple input parameters. Validation of the modeling method presented here shows performance modeling accuracy improvement of 0.58%, or 1.45°C, over performance models relying on steady-state models at narrow data intervals. / Dissertation/Thesis / Masters Thesis Engineering 2020
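The weighted moving-average idea can be sketched as follows. The steady-state model, the 5-minute time constant, and the coefficients are placeholders of my own, not the thesis's fitted values:

```python
import numpy as np

def steady_state_temp(irradiance, t_ambient, wind):
    # Simple Ross-style steady-state estimate; coefficients are illustrative.
    return t_ambient + 0.03 * irradiance / (1.0 + 0.05 * wind)

def transient_temp(times_min, g, t_amb, wind, tau=5.0, window=20.0):
    """Weighted moving average of steady-state predictions: samples from the
    last `window` minutes get exponentially decaying weights exp(-age/tau),
    mimicking the module's thermal mass."""
    tss = np.array([steady_state_temp(gi, ta, w)
                    for gi, ta, w in zip(g, t_amb, wind)])
    out = np.empty_like(tss)
    for i, t in enumerate(times_min):
        age = t - np.asarray(times_min[:i + 1], dtype=float)
        mask = age <= window
        w = np.exp(-age[mask] / tau)
        out[i] = np.sum(w * tss[:i + 1][mask]) / np.sum(w)
    return out

# Step in irradiance at t = 10 min: the transient prediction approaches
# the new steady-state value gradually instead of jumping.
times = list(range(30))
g = [0.0] * 10 + [800.0] * 20
temps = transient_temp(times, g, [25.0] * 30, [1.0] * 30)
```

The appeal of this formulation is that it adds thermal-mass behavior on top of any existing steady-state model while requiring only the same simple inputs plus a time constant.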
68

Physical layer security in emerging wireless transmission systems

Bao, Tingnan 06 July 2020 (has links)
Traditional cryptographic encryption techniques at higher layers require a certain form of information sharing between the transmitter and the legitimate user to achieve security. They also assume that the eavesdropper has insufficient computational capability to decrypt the ciphertext without the shared information. However, traditional cryptographic encryption techniques may be insufficient or even unsuitable in wireless communication systems. Physical layer security (PLS) can enhance the security of wireless communications by leveraging the physical nature of wireless transmission. Thus, in this thesis, we study PLS performance in emerging wireless transmission systems. The thesis consists of two main parts. We first consider the PLS design and analysis for ground-based networks employing a random unitary beamforming (RUB) scheme at the transmitter. With the RUB technique, the transmitter serves multiple users with pre-designed beamforming vectors, selected using limited channel state information (CSI). We study the multiple-input single-output single-eavesdropper (MISOSE) transmission system, the multi-user multiple-input multiple-output single-eavesdropper (MU-MIMOSE) transmission system, and the massive multiple-input multiple-output multiple-eavesdropper (massive MIMOME) transmission system. Closed-form expressions for the ergodic secrecy rate and the secrecy outage probability (SOP) are derived for these transmission scenarios. The effect of artificial noise (AN) on the secrecy performance of RUB-based transmission is also investigated. Numerical results are presented to illustrate the trade-off between performance and complexity of the resulting PLS design. We then investigate the PLS design and analysis for unmanned aerial vehicle (UAV)-based networks. We first study the secrecy performance of UAV-assisted relaying transmission systems in the presence of a single ground eavesdropper. 
We derive closed-form expressions for the ergodic secrecy rate and the intercept probability. When multiple aerial and ground eavesdroppers are located in the UAV-assisted relaying transmission system, a directional beamforming technique is applied to enhance the secrecy performance. Assuming the most general κ-μ shadowed fading channel, the SOP is obtained in closed form. Exploiting the derived expressions, we investigate the impact of different parameters on secrecy performance. We also utilize a deep learning approach in UAV-based network analysis. Numerical results show that our proposed deep learning approach can predict secrecy performance with high accuracy and short running time. / Graduate
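The secrecy outage probability being derived can be illustrated with a Monte Carlo sketch for a toy Rayleigh-fading wiretap channel. The average SNRs and target secrecy rate are invented for illustration; the thesis itself works with far more general fading models and closed-form results:

```python
import numpy as np

rng = np.random.default_rng(1)

def secrecy_outage_prob(snr_b, snr_e, rate_s, n=200_000):
    """SOP = P[ log2(1 + g_B) - log2(1 + g_E) < rate_s ], with
    exponentially distributed channel power gains (Rayleigh fading)."""
    g_b = snr_b * rng.exponential(size=n)   # legitimate (Bob) link
    g_e = snr_e * rng.exponential(size=n)   # eavesdropper (Eve) link
    c_s = np.log2(1.0 + g_b) - np.log2(1.0 + g_e)
    return float(np.mean(c_s < rate_s))

p_strong = secrecy_outage_prob(snr_b=100.0, snr_e=1.0, rate_s=1.0)
p_weak = secrecy_outage_prob(snr_b=10.0, snr_e=1.0, rate_s=1.0)
```

Simulations of this kind are the standard sanity check against which closed-form SOP expressions are validated: a stronger main channel should drive the outage probability down, as the two sample points show.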
69

On the Performance Analysis of Free-Space Optical Links under Generalized Turbulence and Misalignment Models

Al-Quwaiee, Hessa 11 1900 (has links)
One of the potential solutions to the radio frequency (RF) spectrum scarcity problem is optical wireless communications (OWC), which utilizes the unlicensed optical spectrum. Long-range outdoor OWC is usually referred to in the literature as free-space optical (FSO) communications. Unlike RF systems, FSO is immune to interference and multi-path fading. Also, the deployment of FSO systems is flexible and much faster than optical fibers. These attractive features make FSO applicable for broadband wireless transmission such as optical fiber backup, metropolitan area networks, and last mile access. Although FSO communication is a promising technology, it is negatively affected by two physical phenomena, namely, scintillation due to atmospheric turbulence and pointing errors. These two critical issues have prompted intensive research in the last decade. To quantify the effect of these two factors on FSO system performance, we need effective mathematical models. In this work, we propose and study a generalized pointing error model based on the Beckmann distribution. Then, we aim to generalize the FSO channel model to span all turbulence conditions from weak to strong while taking pointing errors into consideration. Since scintillation in FSO is analogous to the fading phenomena in RF, diversity has also been proposed to overcome the effect of irradiance fluctuations. Thus, several combining techniques for not necessarily independent dual-branch free-space optical links were investigated over both weak and strong turbulence channels in the presence of pointing errors. On another front, improving the performance, enhancing the capacity, and reducing the delay of the communication link has been the motivation for newly developed schemes, especially for backhauling. Recently, there has been a growing interest in practical systems that integrate RF and FSO technologies to solve the last mile bottleneck. 
As such, we also study in this thesis an asymmetric RF-FSO dual-hop relay transmission system with both fixed- and variable-gain relaying.
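The generalized pointing-error idea can be illustrated numerically. In the Beckmann model, the horizontal and vertical displacements are independent, non-identically distributed Gaussians with nonzero means, and the radial error is their norm; the offsets, jitters, beam widths, and the simple Gaussian-beam collection formula below are illustrative assumptions, not the thesis's derivations:

```python
import numpy as np

rng = np.random.default_rng(2)

def beckmann_radial_samples(mu_x, mu_y, sigma_x, sigma_y, n=100_000):
    # Boresight offsets (mu) and jitter (sigma) differ per axis.
    x = rng.normal(mu_x, sigma_x, n)
    y = rng.normal(mu_y, sigma_y, n)
    return np.hypot(x, y)

def mean_pointing_gain(r, a0, w_zeq):
    """Gaussian-beam approximation of the collected power fraction,
    h_p ~ A0 * exp(-2 r^2 / w_zeq^2), averaged over the displacement."""
    return float(np.mean(a0 * np.exp(-2.0 * r**2 / w_zeq**2)))

r = beckmann_radial_samples(mu_x=0.1, mu_y=0.2, sigma_x=0.3, sigma_y=0.4)
gain_wide = mean_pointing_gain(r, a0=0.8, w_zeq=2.5)
gain_narrow = mean_pointing_gain(r, a0=0.8, w_zeq=1.0)
```

The comparison captures the basic design trade-off: a narrower beam concentrates power but suffers more from the same pointing jitter, which is why pointing-error statistics enter directly into FSO link budgets.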
70

Power utility systems modelling and performance analysis

Percale, E.U. January 1990 (has links)
Any business striving to improve its productivity must first establish and practise at all levels a universal method for the measurement and analysis of its performance. A prerequisite for any analysis is an appropriate definition of the system which is to be analysed. The rationale and derivation process for such a system definition is termed "modelling", and its product a "model". Deterministic Productivity Accounting (DPA) is a comparative analysis method for business performance. It is based on the premise that business performance is primarily determined by resource management, and measured in terms of productivity. By judicious partitioning and modelling of the business systems, and careful counting and accounting for every variance component, one traces the driving causes behind the apparent performance. This work combines modelling of power utility systems with the application of DPA into an integrated method for performance measurement and analysis within a power utility, especially in a power station. / Dissertation (MEng)--University of Pretoria, 1990. / gm2014 / Industrial and Systems Engineering / unrestricted
