251

Utilização de web semântica para seleção de informações de web services no registro UDDI: uma abordagem com qualidade de serviço / The use of the Semantic Web for selecting Web Services information in the UDDI registry: an approach with quality of service

Nakamura, Luis Hideo Vasconcelos 15 February 2012 (has links)
This master's project addresses the use of Semantic Web resources in the selection of information about Web Services in the UDDI (Universal Description, Discovery, and Integration) registry. The registry has the limitation of storing only functional information about Web Services. Non-functional information, including quality of service (QoS) information, is not covered, which makes it difficult for clients to choose the best service. In this project, a knowledge base with information about providers, clients, agreements, services, and the quality of the services provided was represented as an ontology. This ontology is used by the UDOnt-Q (Universal Discovery with Ontology and QoS) module, which was designed as a platform for algorithms that search for and compose services with quality guarantees. Although semantics can also be employed for service composition and automation, the focus of this work is guaranteeing quality of service in Web Services. The algorithms developed employ Semantic Web resources to classify and select suitable Web Services according to the quality information stored in the ontology. The module and the algorithms were subjected to performance evaluations, which revealed performance problems in the approach adopted during the ontology inference process, the process used to classify the information about the elements present in the ontology. However, once the information had been inferred, the service search and selection process demonstrated the viability of the module and of one of its selection algorithms.
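The abstract describes ontology-backed QoS selection but not the algorithm itself, so the following is only a minimal sketch of how such a selection query might look in Python with rdflib. The namespace and property names (Service, responseTime, availability) are hypothetical stand-ins, not the UDOnt-Q vocabulary.

```python
from rdflib import Graph

g = Graph()
g.parse("services.ttl", format="turtle")  # knowledge base with inferred QoS facts

# Hypothetical QoS vocabulary; UDOnt-Q's actual terms are not given in the abstract.
query = """
PREFIX qos: <http://example.org/qos#>
SELECT ?service ?rt ?avail WHERE {
    ?service a qos:Service ;
             qos:responseTime ?rt ;
             qos:availability ?avail .
    FILTER (?rt < 200)
}
ORDER BY DESC(?avail) ?rt
"""
# Rank services that meet a response-time ceiling, highest availability first.
for row in g.query(query):
    print(row.service, row.rt, row.avail)
```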
252

Optimisation et évaluation des performances en traitement d'image / Optimisation and Performance Evaluation in Image Registration Techniques

Mambo, Shadrack 19 October 2018 (has links)
Doctor of Technology thesis summary: The importance of medical imaging as a core component of several medical applications and of healthcare diagnosis cannot be overemphasised. Integration of the useful data acquired from different images is vital for proper analysis of the information contained in the images under observation. For the integration process to succeed, a procedure referred to as image registration is necessary.

The purpose of image registration is to align two images in order to find a geometric transformation that brings one image into the best possible spatial correspondence with the other by optimising a registration criterion. The two images are known as the target image and the source image. Image registration methods begin by referencing the two images with control points. This is followed by a registration transformation that relates the two images, and by a similarity metric, a function that measures the qualitative closeness or degree of fit between the target image and the source image. Finally, an optimiser that seeks an optimal transformation inside the defined solution search space is applied.

This research presents an automated image registration algorithm for multimodal registration of lung computed tomography (CT) scan pairs, in which a regular step gradient descent optimisation technique is compared against evolutionary optimisation. The aim of the research is to carry out optimisation and performance evaluation of image registration techniques in order to give medical specialists an estimate of how accurate and robust the registration process is. Lung CT scan pairs are registered using mutual information as the similarity measure, an affine transformation, and linear interpolation. To minimise the cost function, an optimiser that seeks the optimal transformation inside the defined search space is applied.

Determination of a transformation model that depends on transformation parameters, and identification of a similarity metric based on voxel intensity, were carried out. By fitting transformations to control points, three transformation models were compared: the affine transformation produced the best recovered image compared with the non-reflective similarity and projective transformations. The results compare well with documented results from the EMPIRE 10 Challenge and conform to both theoretical principles and practical applications.

The contribution of this research is its potential to increase the scientific understanding of image registration of anatomical body organs. It lays a basis for further research into performance evaluation of registration techniques and validation of procedures for other types of algorithms and application areas of image registration, such as remote sensing, satellite communication, biomedical engineering, robotics, and geographical information systems and mapping, among others.
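As a rough illustration of the pipeline described above (mutual information metric, affine transform, linear interpolation, regular step gradient descent), here is a minimal sketch using the SimpleITK library. This is not the thesis's implementation: the file names are placeholders and the parameter values are illustrative guesses.

```python
import SimpleITK as sitk

# Placeholder file names; any pair of lung CT volumes would do.
fixed = sitk.ReadImage("ct_fixed.nii.gz", sitk.sitkFloat32)
moving = sitk.ReadImage("ct_moving.nii.gz", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=2.0, minStep=1e-4, numberOfIterations=200)
reg.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        fixed, moving, sitk.AffineTransform(3),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))

transform = reg.Execute(fixed, moving)
print("final metric value:", reg.GetMetricValue())

# Resample the moving image into the fixed image's space for inspection.
registered = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(registered, "ct_registered.nii.gz")
```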
253

On the Bleeding Edge: Debloating Internet Access Networks

Høiland-Jørgensen, Toke January 2016 (has links)
As ever more devices are connected to the internet, and applications turn ever more interactive, it becomes more important that the network can be counted on to respond reliably and without unnecessary delay. However, this is far from always the case today, as there can be many potential sources of unnecessary delay. In this thesis we focus on one of them: Excess queueing delay in network routers along the path, also known as bufferbloat. We focus on the home network, and treat the issue in three stages. We examine latency variation and queueing delay on the public internet and show that significant excess delay is often present. Then, we evaluate several modern AQM algorithms and packet schedulers in a residential setting, and show that modern AQMs can almost entirely eliminate bufferbloat and extra queueing latency for wired connections, but that they are not as effective for WiFi links. Finally, we go on to design and implement a solution for bufferbloat at the WiFi link, and also design a workable scheduler-based solution for realising airtime fairness in WiFi. Also included in this thesis is a description of Flent, a measurement tool used to perform most of the experiments in the other papers, and also used widely in the bufferbloat community. / HITS, 4707
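The modern AQM algorithms evaluated in work like this include the CoDel family. Purely as a hedged illustration of the core idea (drop a packet when queueing delay has stayed above a small target for a full interval, then shorten the time between drops by the square root of the drop count), here is a simplified Python model. It follows the published CoDel control law in outline but omits many details of real implementations.

```python
import math

TARGET = 0.005    # 5 ms of acceptable standing queue delay
INTERVAL = 0.100  # delay must persist for 100 ms before dropping starts

class CoDelSketch:
    """Simplified CoDel drop decision; not a faithful kernel implementation."""

    def __init__(self):
        self.first_above_time = 0.0  # when the delay first exceeded TARGET
        self.drop_next = 0.0         # scheduled time of the next drop
        self.count = 0               # drops in the current dropping state
        self.dropping = False

    def should_drop(self, sojourn_time, now):
        """Called per dequeued packet with its queueing delay and the current time."""
        if sojourn_time < TARGET:
            # Delay is acceptable again: leave the dropping state.
            self.first_above_time = 0.0
            self.dropping = False
            return False
        if self.first_above_time == 0.0:
            self.first_above_time = now + INTERVAL
            return False
        if not self.dropping and now >= self.first_above_time:
            # Delay stayed above target for a full interval: start dropping.
            self.dropping = True
            self.count = 1
            self.drop_next = now + INTERVAL
            return True
        if self.dropping and now >= self.drop_next:
            # Control law: drops come faster as the count grows.
            self.count += 1
            self.drop_next = now + INTERVAL / math.sqrt(self.count)
            return True
        return False
```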
254

Improving the Performance of the Eiffel Event Persistence Solution / 提高EIFFEL事件持久性解决方案的性能

Hellenberg, Rickard January 2019 (has links)
Deciding which database management system (DBMS) to use has perhaps never been harder. In recent years there has been an explosive growth of new types of database management systems that address different issues and perform well in different scenarios. This thesis is a case study on improving an Event Persistence Solution for the Eiffel Framework, a framework used to achieve traceability in very-large-scale systems development. The purpose of this thesis is to investigate whether the performance of the Eiffel Event Persistence Solution can be improved by changing from MongoDB to Elasticsearch or ArangoDB. Experiments were conducted to measure the request throughput for four types of requests. As a prerequisite to measuring performance, support for the different DBMSs, and the ability to switch between them, was implemented. The results showed that Elasticsearch performed better than MongoDB for both nested-document search and graph-traversal operations. ArangoDB had even better performance for graph-traversal operations but inadequate performance for nested-document search.
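The throughput experiments themselves are not shown in the abstract; as a minimal sketch of that kind of measurement (not the thesis's test harness), the following times a batch of identical requests against a hypothetical REST endpoint using Python's requests library.

```python
import time

import requests

# Hypothetical endpoint and query for an event-persistence search API.
URL = "http://localhost:8080/events/search"
PARAMS = {"meta.type": "EiffelActivityTriggeredEvent"}
N_REQUESTS = 1000

start = time.perf_counter()
with requests.Session() as session:  # reuse one connection, as a real client would
    for _ in range(N_REQUESTS):
        response = session.get(URL, params=PARAMS, timeout=5)
        response.raise_for_status()
elapsed = time.perf_counter() - start

print(f"{N_REQUESTS / elapsed:.1f} requests/second "
      f"({elapsed / N_REQUESTS * 1000:.2f} ms per request on average)")
```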
255

Sequential Probing With a Random Start

Miller, Joshua 01 January 2018 (has links)
Processing user requests quickly requires not only fast servers but also methods for quickly locating idle servers to handle those requests. Methods of finding idle servers are analogous to open addressing in hash tables, but with the key difference that servers may return to an idle state after having been busy rather than staying busy. Probing sequences for open addressing are well studied, but algorithms for locating idle servers are less understood. We investigate sequential probing with a random start as a method for finding idle servers, especially in cases of heavy traffic. We present a procedure for finding the distribution of the number of probes required to find an idle server by using a Markov chain and ideas from enumerative combinatorics, then present numerical simulation results in lieu of a general analytic solution.
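A back-of-the-envelope way to see the probe-count distribution is direct simulation. The sketch below is a stand-in for the paper's Markov-chain procedure under an extra simplifying assumption that each of m servers is busy independently with probability p; it starts at a uniformly random server and probes sequentially with wraparound.

```python
import random
from collections import Counter

def probes_to_find_idle(m, p, rng):
    """Probe sequentially from a random start; return probes used, or None if all busy."""
    busy = [rng.random() < p for _ in range(m)]
    start = rng.randrange(m)
    for k in range(m):
        if not busy[(start + k) % m]:
            return k + 1
    return None

def probe_distribution(m, p, trials=100_000, seed=1):
    rng = random.Random(seed)
    results = [probes_to_find_idle(m, p, rng) for _ in range(trials)]
    counts = Counter(r for r in results if r is not None)
    return {k: counts[k] / trials for k in sorted(counts)}

# Heavy traffic: 20 servers, each busy 90% of the time.
for k, freq in probe_distribution(m=20, p=0.9).items():
    print(f"{k:2d} probes: {freq:.4f}")
```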
256

Factors contributing to the performance of the Roads Agency Limpopo in terms of roads infrastructure delivery

Rapetsoa, Molatelo January 2011 (has links)
Thesis (M.Dev.) --University of Limpopo, 2011 / The overall aim of the study was to investigate the factors contributing to RAL's performance on roads infrastructure delivery in Limpopo Province over the first ten years since its inception. The study also sought to assess RAL's steadiness, weighing its current practices and processes against the key external and internal developmental challenges facing the construction industry, in particular the civil engineering profession. A descriptive, explorative research design using an instrumental case study was employed to achieve the aim of the study. The population comprised all the people and companies involved with RAL projects in Limpopo Province, including RAL staff and active CIDB-registered civil engineering contracting and consulting firms. Unstructured telephonic interviews and a structured questionnaire helped the researcher understand the factors contributing to RAL's performance on roads. The study revealed several strengths and weaknesses contributing to RAL's performance, concerning the agency's nature and composition and the level of implementation of its strategies, policies and plans. A clear and implementable quality assurance system must be developed to mitigate all risks associated with project management processes and procedures. Strategies should also be identified for how the agency's performance can progress despite economic instability and political interference. Proper strategic planning has thus far proved prudent in identifying risks and ways to mitigate them. According to the study, RAL's key strengths are that it is an organisation of its own type specialising in roads infrastructure; its nature and size; its vast experience and knowledge of construction and project organisation; and its project managers' experience, competence and commitment to finishing projects within time, cost and budget. These have emerged as the key ingredients behind its proven track record. The study also demonstrated RAL's strength in communication, control and dedication in managing projects. However, too many tenders are issued within a short space of time, which leaves contractors unable to respond to them adequately. RAL has demonstrated that it is negatively affected by an unstable economic environment, as construction material costs are closely tied to inflation. To that end, RAL should constantly monitor market conditions and synchronise its activities so as to position itself correctly, since these are unavoidable external factors that require a hands-on approach.
257

Uma abordagem para a integração de diferentes fontes de dados provenientes de redes de sensores no contexto de internet das coisas / An Approach for the Integration of Different Data Sources from Sensor Networks in the Internet of Things Context

Barros, Vinícius Aires 05 August 2019 (has links)
The main characteristic of the Internet of Things (IoT) is the network connection among devices such as sensors, smartphones, and wearables in order to collect information about the physical environments in which they are located. A related challenge is the lack of standardisation of communication among devices, and of a mechanism that performs data processing, storage, and retrieval in a simplified way, making interoperability a major problem. This thesis introduces the Internet of Things Data as a Service Module (IoTDSM), an approach based on the Data as a Service (DaaS) model that aims to assist the management of heterogeneous sensor data sources regardless of their origin, format, or underlying database system. In addition, the Internet of Things Multi-Protocol Message Broker (IoTM2B) is presented, an extension of IoTDSM that allows integration with different sensor-network communication protocols. A performance evaluation methodology was used to evaluate this research, which contributed to identifying the limitations of the proposed mechanisms. Furthermore, different evaluation scenarios were employed: (i) a performance evaluation of the Global Sensor Network (GSN) middleware, which guided the definition of the other evaluation scenarios; (ii) a performance evaluation of IoTDSM using different database systems (PostgreSQL or MongoDB) and data formats (JSON or XML) for processing worldwide climatic data; (iii) a performance evaluation of the IoTM2B strategy, which allowed the integration of IoTDSM with the HTTP, MQTT, and CoAP protocols in a Machine-to-Machine (M2M) and cloud computing environment; and (iv) an evaluation of an architecture for classifying users' emotions in a smart home environment. Finally, the results obtained are discussed, demonstrating that IoTDSM and IoTM2B can be integrated with different data formats, storage strategies, and communication protocols.
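As a small illustration of one of the protocol paths such a broker integrates (MQTT), here is a hedged sketch of a sensor publishing a JSON reading with the paho-mqtt client. The broker address and topic layout are invented for the example, not taken from IoTM2B.

```python
import json
import time

import paho.mqtt.client as mqtt

# Hypothetical broker address and topic; not IoTM2B's actual configuration.
BROKER_HOST = "localhost"
TOPIC = "sensors/living-room/temperature"

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()  # background thread handles the network traffic

reading = {
    "sensor_id": "temp-01",
    "value": 22.4,           # degrees Celsius
    "timestamp": time.time(),
}
client.publish(TOPIC, json.dumps(reading), qos=1)

client.loop_stop()
client.disconnect()
```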
258

Design and Evaluation of the Combined Input and Crossbar Queued (CICQ) Switch

Yoshigoe, Kenji 09 August 2004 (has links)
Packet switches are used in the Internet to forward information between a sender and receiver, and they are the critical bottleneck in the Internet. Without faster packet switch designs, the Internet cannot continue to scale up to higher data rates. Packet switches must be able to achieve high throughput and low delay. In addition, they must be stable for all traffic loads, must efficiently support variable-length packets, and must be scalable to higher link data rates and greater numbers of ports. This dissertation investigates a new combined input and crossbar queued (CICQ) switch architecture. Some unbalanced traffic loads result in instability for input queued (IQ) and CICQ switches. This instability region was modeled, and the cause of the instability was found to be a lack of work conservation at one port. A new burst stabilization protocol was investigated and shown to stabilize both IQ and CICQ switches. As an added benefit, this new protocol did not require a costly internal switch speed-up. Switching variable-length packets in IQ switches requires the segmentation of packets into cells. The process also requires an internal switch speed-up, which can be costly. A new method of cell-merging in IQ switches reduced this speed-up. To improve fairness for CICQ switches, a block-and-transfer method was proposed and evaluated. Implementation feasibility of the CICQ switch was also investigated via a field programmable gate array (FPGA) implementation of key components. Two new designs for round robin arbiters were developed and evaluated. The first of these, a proposed priority-encoder-based round robin arbiter that uses feedback masking, has a lower delay than any known design for an FPGA implementation. The second, an overlapped round robin arbiter design that fully overlaps round robin polling and scheduling, was proposed and shown to be scalable, work conserving, and fair. To allow for multi-cabinet implementation and minimization of the size of the cross point buffers, a distributed input port queue scheduler was investigated. This new scheduler minimizes the amount of buffering needed within the crossbar. The two primary contributions of this dissertation are 1) a complete understanding of the performance characteristics of the CICQ switch, and 2) new methods for improving the performance, stability, and scalability of the CICQ switch. This work has shown that the CICQ switch can be the switch architecture of the future.
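The dissertation's arbiter designs are FPGA hardware (including the priority-encoder-based arbiter with feedback masking), so the following is only a behavioral software model of the basic round robin idea they build on: grant the first requester at or after a rotating pointer, then advance the pointer past the granted port so it has the lowest priority next round.

```python
class RoundRobinArbiter:
    """Behavioral model of a basic round robin arbiter (not the FPGA design)."""

    def __init__(self, n_ports):
        self.n = n_ports
        self.pointer = 0  # port with the highest priority this round

    def grant(self, requests):
        """requests: list of n booleans; returns the granted port index, or None."""
        for offset in range(self.n):
            port = (self.pointer + offset) % self.n
            if requests[port]:
                # The granted port gets the lowest priority next round: fairness.
                self.pointer = (port + 1) % self.n
                return port
        return None  # no requests this cycle

arb = RoundRobinArbiter(4)
print(arb.grant([False, True, False, True]))  # grants port 1
print(arb.grant([False, True, False, True]))  # grants port 3, pointer has advanced
```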
259

Stereotype threat behind the wheel

Yeung, Nai Chi, Psychology, Faculty of Science, UNSW January 2006 (has links)
Stereotype threat refers to the pressure that individuals feel when they are at risk of confirming a demeaning stereotype about themselves. Research has found that stereotype threat impairs performance on cognitive-based tasks by inducing mental interference (e.g., Schmader & Johns, 2003). This thesis hypothesised that this finding would generalise to driving and that drivers who are better able to inhibit cognitive interference (i.e., with better inhibitory ability) would be less susceptible to the disruptive effect of stereotype threat than drivers who are less able (i.e., with poorer inhibitory ability). A series of three experiments conducted in a driving simulator tested the predictions using the gender stereotype of driving skills and investigated the interpretation of the results. The experiments revealed that stereotype threat exerted both a facilitative and debilitative influence on driving performance, as indicated by different performance measures. The facilitative effect diminished when drivers experienced increased mental demands or when they were assessed by an unexpected performance measure, while the debilitative effect was more likely observed among drivers who received negative feedback than drivers who received positive feedback. Moreover, the results supported the prediction that inhibitory ability would moderate the detrimental impact of stereotype threat as the performance of drivers with poorer inhibitory ability was impeded more than that of drivers with better inhibitory ability. Regarding the processes underlying the present findings, the experiments provided suggestive evidence that stereotype threat elicits cognitive interference and simultaneously motivates drivers to concentrate on particular performance areas in an attempt to refute the stereotype. In combination, these processes appear to be at least partly responsible for the performance deficits and boosts observed.
260

New Directions in Project Performance and Progress Evaluation

Bower, Douglas, not supplied January 2007 (has links)
Dr. Bower confirmed that earned value management (EVM) is not widely adopted, and that many project managers see the methodology as overly complex and difficult to implement. He identified several serious challenges associated with conventional EVM, and addressed the first by creating a new theoretical concept called Assured Value Analysis (AVA). This add-in process provides two new measures, permitting improvements to EVM that take into account the added certainty provided through procurement. Assured Value (AV) represents the budget for a future signed contract, and Expected Cost (EC) represents the agreed cost of that contract. These measures permit the calculation of a Total Cost Variance that includes not only cost deviations to date, but also future ones to which the project team is already committed. AVA also allows conventional EVM formulae to take into account the Assured Value and Expected Cost of future signed agreements. A simple notional project is used to demonstrate the implementation of AVA. He resolved the remaining challenges by realising that isolating project phases would provide a simplified but more dependable methodology, one that also offers features not found in conventional EVM. Significant milestones are normally planned to occur at the end of a project phase, so by assessing project performance only at the end of each completed phase, performance calculations are significantly simplified. His new technique, Phase Earned Value Analysis (PEVA), simplifies the calculation of PV, EV and AC, and also provides benefits that are not possible with EVM. Since the planned and actual phase completion dates are known, an intuitively simple but accurate time-based schedule variance and schedule performance index (SVP and SPIP) can be measured. PEVA also permits forecasting of future phase end-cost figures and phase completion dates using the phase CPI and SPI ratios. Since PEVA employs data points with specific x-axis and y-axis values, these can readily be plotted and trend lines identified with standard spreadsheet functions. This is a powerful feature, as it allows key project stakeholders to visualise emerging performance trends as each phase is completed. Finally, he successfully combined the AVA and PEVA concepts, resulting in a new EVM methodology, Phase Assured Value Analysis (PAVA), which takes into account the assurance provided by procurement, simplifies the calculation of earned value through phases, and provides powerful forecasting and charting features. He validated this new combined approach in multiple respects. The new AVA and PEVA formulae were rigorously established and confirmed through standard algebraic procedures, and were tested in sample project situations to clearly demonstrate their functions. He argues that the PAVA approach conforms to the 32 criteria established in the United States for full EVM compliance. He presented AVA and PEVA to critical audiences at major project management conferences in North America and the UK, gaining expert criticism from several organisations and practitioners. Finally, he used archived cost and schedule records to retrospectively test the combined PAVA methodology on a significant office facilities and technology program.
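Standard EVM defines cost variance as CV = EV - AC. On the abstract's description, an AVA-style Total Cost Variance would add the committed future gap (AV - EC) to the cost deviation to date. The exact AVA formulae are not reproduced here, so this Python sketch is a hedged reconstruction from the definitions above, with invented sample figures.

```python
def evm_with_ava(pv, ev, ac, av, ec):
    """Conventional EVM measures plus an AVA-style total cost variance.

    pv, ev, ac: planned value, earned value, actual cost to date.
    av: Assured Value, the budget for future signed contracts.
    ec: Expected Cost, the agreed cost of those contracts.
    The total-cost-variance formula is reconstructed from the abstract's
    description, not quoted from the thesis.
    """
    cost_variance = ev - ac        # standard EVM: work done vs. money spent
    schedule_variance = ev - pv    # standard EVM: work done vs. work planned
    committed_variance = av - ec   # AVA: budget vs. agreed cost of signed work
    return {
        "CV": cost_variance,
        "SV": schedule_variance,
        "AV-EC": committed_variance,
        "Total CV": cost_variance + committed_variance,
    }

# Invented sample figures (in thousands): overspending to date, but a
# signed future contract came in under its budget.
print(evm_with_ava(pv=500, ev=480, ac=510, av=200, ec=185))
# {'CV': -30, 'SV': -20, 'AV-EC': 15, 'Total CV': -15}
```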
