151

Análise do problema do Protocolo MAC IEEE 802.11 em redes Ad Hoc Multihop / Analysis of the IEEE 802.11 MAC protocol problem in multihop ad hoc networks

ALMEIDA, Adalton de Sena January 2003 (has links)
The IEEE 802.11 DFWMAC (Distributed Foundation Wireless Medium Access Control) protocol was standardized for wireless local area networks and has been widely used to test and simulate multihop wireless ad hoc networks. The protocol exhibits problems in multihop ad hoc networks, and these become evident when TCP (Transmission Control Protocol) traffic is run between two stations. Because DFWMAC is a distributed medium access control protocol with no central coordinator, each station decides for itself when to transmit, following the DFWMAC rules. This distributed operation also gives rise to the hidden terminal and exposed terminal problems, which can significantly degrade TCP connections. On top of the hidden and exposed terminal problems, the Binary Exponential Backoff (BEB) algorithm further prevents the protocol from performing well in multihop ad hoc networks. The combined effect of all these problems is degraded TCP throughput, producing instability and unfairness in access to the shared medium. The instability shows up as very large throughput variation over very short time intervals, observable with a single TCP connection between two stations. The unfairness appears when two TCP connections run simultaneously: one transmits data packets at a high rate, consuming the entire bandwidth, while the other fails to transmit any packet and remains at zero throughput for as long as both connections are active. This work proposes a solution to address these problems.
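
The role of BEB in that unfairness can be seen in a deliberately simplified toy model, not the thesis's simulator: two stations contend for the medium, and the losing station's frame is pessimistically treated as lost, as happens under hidden-terminal interference (in real DCF the loser would merely defer). With the usual CWmin = 31 / CWmax = 1023 window bounds, the early winner keeps a small contention window and starves the loser.

```python
import random

CW_MIN, CW_MAX = 31, 1023  # typical 802.11 DCF contention window bounds

class Station:
    def __init__(self, name):
        self.name = name
        self.cw = CW_MIN

    def draw_backoff(self):
        # Choose a random slot in [0, cw]; the smaller draw transmits first.
        return random.randint(0, self.cw)

    def on_failure(self):
        # BEB: double the contention window after a failed transmission.
        self.cw = min(2 * (self.cw + 1) - 1, CW_MAX)

    def on_success(self):
        # The winner resets to CW_MIN, so it tends to keep winning.
        self.cw = CW_MIN

a, b = Station("A"), Station("B")
wins = {"A": 0, "B": 0}
for _ in range(10_000):
    da, db = a.draw_backoff(), b.draw_backoff()
    if da == db:                        # same slot: both frames collide
        a.on_failure(); b.on_failure()
    elif da < db:                       # A wins; B's frame is assumed lost
        a.on_success(); b.on_failure()  # (hidden-terminal pessimism)
        wins["A"] += 1
    else:
        b.on_success(); a.on_failure()
        wins["B"] += 1
print(wins)  # typically highly asymmetric: one station starves the other
```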
152

Sustainable Throughput – QoE Perspective

Darisipudi, Veeravenkata Naga S Maniteja January 2017 (has links)
In recent years, demand for streaming high-quality video on smart mobile phones has increased significantly. To meet user quality requirements, end-user quality must be maintained while taking resource consumption into consideration. This demand has led research communities and network providers to prioritize Quality of Experience (QoE) in addition to Quality of Service (QoS). QoE studies have therefore gained great importance, creating the challenge of evaluating QoE in a way that accounts for quality, cost, and energy consumption. This gives rise to the concept of QoE-aware sustainable throughput: the maximal throughput at which QoE problems can still be kept at a desired level. The aim of this thesis is to determine sustainable throughput values from the QoE perspective. Values are observed for different delay and packet-loss levels in wireless and mobile scenarios, using subjective video quality assessment with the ITU-T recommended Absolute Category Rating (ACR) method. Video quality ratings are collected from users and averaged to obtain the Mean Opinion Score (MOS); the resulting scores are analyzed to determine sustainable throughput from the users' perspective. The results show that, for all video test cases, videos are rated higher at low packet-loss and low delay values, and that quality under delay is rated higher than quality under packet loss. High-resolution videos proved fragile in the presence of strong disturbances, i.e., high packet loss and large delays. Considering all cases, QoE degradation due to delivery issues remains at an acceptable minimum for the 480x360 video; hence 480x360 is the threshold resolution for sustaining video quality.
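
As a minimal illustration of the MOS step, assuming the standard five-point ACR scale (1 = Bad … 5 = Excellent); the ratings below are invented, not the thesis data:

```python
# ACR ratings on the five-point scale (1 = Bad ... 5 = Excellent) for one
# test condition; the values are illustrative only.
ratings = [4, 5, 3, 4, 4, 5, 3, 4]

mos = sum(ratings) / len(ratings)  # Mean Opinion Score
print(f"MOS = {mos:.2f}")          # MOS = 4.00
```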
153

Performance evaluation of Cassandra in AWS environment : An experiment

SUBBA REDDY GARI, AVINASH KUMAR REDDY January 2017 (has links)
Context. In the field of computer science, cloud computing plays a prominent role: hosted on the internet, it is used to store, manage, and process data. Cloud platforms enable users to perform large numbers of computing tasks across remote servers. Several cloud platform providers exist, such as Amazon, Microsoft, Google, Oracle, and IBM, and several database systems are available on these platforms to handle data. Cassandra is a NoSQL database system that can handle unstructured data and scale to large numbers of operations per second, even across multiple datacentres. Objectives. In this study, the performance of a NoSQL database on the AWS cloud platform is evaluated. The performance of a three-node Cassandra cluster is evaluated for different EC2 instance configurations, using throughput and CPU utilization as metrics. The main aim of this thesis was to evaluate the performance of Cassandra under various configurations with the YCSB benchmarking tool. Methods. A literature review was conducted to gain more knowledge of the current research area and to identify the metrics required to evaluate the performance of Cassandra. An experiment was then conducted to measure throughput and CPU utilization under the t2.micro, t2.medium, and t2.small configurations for 3-node and 6-node clusters using the YCSB benchmarking tool. Results. The results comprise the metrics identified in the literature review, throughput and CPU utilization, plotted as graphs to compare performance across the three configurations and segregated into two scenarios: 3-node and 6-node clusters. Conclusions. Based on the obtained throughput values, an optimal or sub-optimal configuration of a data centre running multiple instances of Cassandra can be identified such that specific throughput requirements are satisfied.
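
For context on the measurement itself: a YCSB run emits summary lines of the form `[OVERALL], Throughput(ops/sec), …`. A minimal sketch of driving the benchmark and extracting that figure follows; the hosts, paths, and property values are placeholders, and the output format is assumed from YCSB's usual report:

```python
import subprocess

# Placeholder hosts and paths; adjust to the actual cluster and YCSB install.
HOSTS = "10.0.0.1,10.0.0.2,10.0.0.3"
cmd = ["bin/ycsb", "run", "cassandra-cql",
       "-P", "workloads/workloada",
       "-p", f"hosts={HOSTS}",
       "-p", "operationcount=100000"]

out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

# YCSB prints summary metrics as "[SECTION], metric, value" lines.
for line in out.splitlines():
    if line.startswith("[OVERALL]") and "Throughput" in line:
        print(f"overall throughput: {float(line.split(',')[-1]):.1f} ops/sec")
```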
154

Integration and Evaluation of IoT Hardware and Software platforms

Xu, Ting January 2017 (has links)
The Internet of Things (IoT) has grown rapidly in recent years, and its influence on everyday life and behavior is increasing. It is a network connecting physical devices, vehicles, buildings, and other items embedded with electronics, software, sensors, actuators, and network connectivity, so that these objects can collect and exchange data. It is used in many domains, such as transportation and logistics, healthcare, smart environments, and personal and social applications, and it is estimated that the IoT will consist of almost 50 billion objects by 2020. The IoT gateway plays an important role here: it bridges traditional communication networks with sensor networks, making network communication easier. This study aims to integrate and evaluate IoT gateways and IoT communication systems. It proposes a scenario in which the IoT gateway connects to an actuator in order to control it and transmit the data via the IoT communication system; creates a demonstrator by setting up communication between the IoT gateway platform and the IoT communication system; measures and evaluates performance in terms of latency and throughput using the implemented scenario; and finally draws conclusions.
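
A minimal sketch of the kind of latency and throughput measurement described here, assuming a hypothetical echo-style gateway endpoint reachable over TCP; the host, port, payload size, and round count are all made up:

```python
import socket, time

HOST, PORT = "192.0.2.10", 5000   # hypothetical gateway endpoint
PAYLOAD = b"x" * 1024             # 1 KiB probe message
ROUNDS = 100

latencies, received = [], 0
t0 = time.perf_counter()
with socket.create_connection((HOST, PORT)) as s:
    for _ in range(ROUNDS):
        start = time.perf_counter()
        s.sendall(PAYLOAD)             # send the probe
        data = s.recv(len(PAYLOAD))    # assumes the gateway echoes it back
        latencies.append(time.perf_counter() - start)
        received += len(data)
elapsed = time.perf_counter() - t0

print(f"mean RTT: {1000 * sum(latencies) / len(latencies):.2f} ms")
print(f"throughput: {received / elapsed / 1024:.1f} KiB/s")
```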
155

Virtualization of Data Centers : Case Study on Server Virtualization

Kappagantula, Sri Kasyap January 2018 (has links)
Data centers today use virtualization as a technique to carve independent virtual resources out of the available physical hardware. The technique is implemented in data centers to maximize the utilization of physical hardware, which significantly reduces energy consumption and operating costs, without affecting Quality of Service (QoS). The main objectives of this thesis are to study the different network topologies used in data center architecture, to compare the QoS parameters of virtual servers against physical servers, and to identify which virtualization technology performs better. The research methodology used in this thesis is qualitative. To measure QoS, we take latency, packet loss, and throughput for virtual servers under different virtualization technologies (KVM, ESXi, Hyper-V, Fusion, and VirtualBox) and compare their performance against the physical server. The work also investigates CPU and RAM utilization and compares the behavior of physical and virtual servers under different load conditions. The results show that the virtual servers performed better in terms of resource utilization, latency, and response times than the physical servers. However, factors such as backup and recovery, VM sprawl, capacity planning, and building a private cloud must still be addressed for virtual data centers to grow. The parameters that affect virtual server performance are identified, and the trade-off between virtual and physical servers is established in terms of QoS. Overall, the virtual servers were more effective than the physical servers.
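
A minimal sketch of the CPU and RAM utilization sampling such a comparison requires, using the third-party psutil library; the sample count and interval are arbitrary choices, and the same script would be run on the physical host and inside each VM under identical load:

```python
import psutil

SAMPLES, INTERVAL_S = 30, 1.0  # arbitrary: thirty one-second samples

cpu, ram = [], []
for _ in range(SAMPLES):
    cpu.append(psutil.cpu_percent(interval=INTERVAL_S))  # % CPU over the interval
    ram.append(psutil.virtual_memory().percent)          # % of RAM in use

print(f"CPU: mean {sum(cpu) / len(cpu):.1f}%  peak {max(cpu):.1f}%")
print(f"RAM: mean {sum(ram) / len(ram):.1f}%  peak {max(ram):.1f}%")
```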
156

Characterization of the naïve kappa light chain murine immunoglobulin repertoire in spaceflight

Ward, Claire January 1900 (has links)
Immunoglobulins are receptors expressed on the surface of a B cell that specifically bind pathogens and toxic substances within a host. These receptors are heterodimers of two chains, heavy and light, which are encoded at separate loci. Enzymatic splicing of gene segments at the heavy and light chain loci within the genomic DNA of every B cell produces a highly diversified and specific repertoire of immunoglobulins in a single host. Spaceflight is known to reduce splenic B cell populations and B cell progenitors within the bone marrow, potentially restricting the diversity of the immunoglobulin repertoire (Ig-Rep). The objective of this thesis project was to characterize the impact of spaceflight on the kappa light-chain Ig-Rep of the C57BL/6 mouse. High-throughput sequencing (HTS) technologies have enabled the rapid characterization of Ig-Reps; however, standard Ig-Rep workflows often rely on amplification of immunoglobulin sequences to ensure capture of sequences from rare B cell clones, and the Ig-Rep is often assessed in sorted B cell populations. Opportunities for spaceflight experiments are limited and costly, and exclusive amplification of immunoglobulin sequences prior to HTS yields a dataset that cannot be mined for additional information. Furthermore, due to the difficulties of tissue collection in spaceflight, HTS of sorted B cell populations is not feasible. We therefore optimized a protocol in which the Ig-Rep is assessed from unamplified whole-tissue immunoglobulin transcripts. The Ig-Rep was characterized by gene segment usage, gene segment combinations, and the region in which gene segments are joined. HTS datasets from ground control animals and animals flown aboard the International Space Station were compared to explore the impact of spaceflight on the unimmunized murine Ig-Rep.
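
A minimal sketch of the gene-segment-usage tally at the heart of such a characterization, assuming transcripts have already been annotated by an IgBLAST-style tool; the record format and gene names here are illustrative, not the thesis data:

```python
from collections import Counter

# Illustrative records: (V segment, J segment, junction) per kappa
# light-chain transcript, as an annotation tool might emit them.
records = [
    ("IGKV1-110", "IGKJ1", "CQQYNSYPLTF"),
    ("IGKV4-55",  "IGKJ2", "CQQWSSNPPTF"),
    ("IGKV1-110", "IGKJ5", "CQQYNSYPYTF"),
]

v_usage  = Counter(v for v, _, _ in records)        # V segment usage
vj_pairs = Counter((v, j) for v, j, _ in records)   # V-J combinations

for v, n in v_usage.most_common():
    print(f"{v}: {n / len(records):.1%}")
print(vj_pairs.most_common(3))
```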
157

Development and application of a rapid micro-scale method of lignin content determination in Arabidopsis thaliana accessions

Chang, Xue Feng 05 1900 (has links)
Lignin is a major chemical component of plants and the second most abundant natural polymer after cellulose. The concerns and interests of agriculture and industry have stimulated the study of genes governing lignin content in plants, in an effort to adapt plants to human purposes. Arabidopsis thaliana provides a convenient model for studying the genes governing lignin content because of its short growth cycle, small plant size, and small, completely sequenced genome. To identify the genes controlling lignin content in Arabidopsis accessions using Quantitative Trait Locus (QTL) analysis, a rapid micro-scale method of lignin determination is required. The acetyl bromide method was modified to enable rapid micro-scale determination of lignin content in Arabidopsis. Modifications included the use of a micro-ball mill, adoption of a modified rapid extraction method, use of an ice bath to stabilize solutions, and reduction of solution volumes. The modified method was shown to be accurate and precise, with values in agreement with those determined by the conventional method. The extinction coefficient for Arabidopsis lignin dissolved using acetyl bromide was determined to be 23.35 g⁻¹ L cm⁻¹. This value is independent of the Arabidopsis accession and environmental growth conditions, and is insensitive to the syringyl/guaiacyl ratio. The modified acetyl bromide method correlated well with the 72% sulfuric acid method once the latter had been corrected for protein contamination and acid-soluble lignin content (R² = 0.988, P < 0.0001). As determined by the newly developed acetyl bromide method and confirmed by the sulfuric acid method, lignin content in Arabidopsis was found to be a divergent property. Lignin content was weakly correlated with growth rate among Arabidopsis accessions (R² = 0.48, P = 0.011) and correlated with plant height among accessions (R² = 0.491, P < 0.0001).
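
The determination itself is a Beer-Lambert calculation; a minimal sketch using the extinction coefficient reported above, with invented values for the absorbance reading, solution volume, and sample mass:

```python
# Beer-Lambert: A = epsilon * c * l, so c = A / (epsilon * l).
EPSILON = 23.35   # g^-1 L cm^-1, from the thesis (acetyl-bromide lignin)
PATH_CM = 1.0     # cuvette path length

absorbance = 0.42    # invented example reading
volume_l   = 0.010   # 10 mL final solution volume (invented)
mass_g     = 0.0050  # 5 mg of cell-wall sample (invented)

lignin_g   = absorbance / (EPSILON * PATH_CM) * volume_l
lignin_pct = 100 * lignin_g / mass_g
print(f"lignin content: {lignin_pct:.1f}%")  # ~3.6% for these numbers
```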
158

Sifter-T: Um framework escalável para anotação filogenômica probabilística funcional de domínios protéicos / Sifter-T: A scalable framework for phylogenomic probabilistic protein domain functional annotation

Danillo Cunha de Almeida e Silva 25 October 2013 (has links)
It is known that many software tools fall out of use because of their complex usability. Even tools known for the quality of their results are abandoned in favour of tools that are faster or simpler to use and install. In the functional annotation field, Sifter (v2.0) is regarded as one of the best with respect to annotation quality; recently it was rated among the best functional annotation tools in the Critical Assessment of protein Function Annotation (CAFA) experiment. Nevertheless, it is still not widely used, probably due to issues with usability and suitability of the framework at high-throughput scale. The original Sifter workflow consists of two main steps: annotation retrieval for a list of genes, and generation of a reconciled gene tree for the same list. Next, based on the gene tree, Sifter builds a Bayesian network with the same structure, in which the leaves represent genes. Known functional annotations are associated with these leaves, and the annotations are then propagated probabilistically along the Bayesian network to the leaves without a priori information. At the end of the process, a list of putative Gene Ontology functions and their occurrence probabilities is generated for each gene of unknown function. The main goal of this work is to optimize the original source code for better performance, potentially allowing it to be used at genome-wide scale. Studying the pre-processing workflow, we found opportunities for improvement and devised strategies to address them. Among the implemented strategies are: the use of parallel threads; CPU load balancing; revised algorithms for better use of disk, memory, and runtime; adaptation of the source code to currently used biological database formats; improved user accessibility; an expanded set of accepted input types; automation of the gene/species tree reconciliation process; sequence filtering to reduce the dimension of the analysis; and other minor implementations. With these implementations we achieved large performance gains: an 87-fold speed-up in the annotation retrieval module and a 72.3% speed increase in the gene tree generation module on quad-core machines, plus a significant reduction in memory consumption during the realignment phase. The result is presented as Sifter-T (Sifter Throughput-optimized), an open-source tool with better usability, performance, and annotation quality than the original Sifter workflow implementation. Sifter-T is written modularly in the Python programming language, is designed to simplify the annotation of complete genomes and proteomes, and presents its results in a form that makes the researcher's work easier.
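
To make the propagation step concrete, here is a deliberately simplified sketch, not Sifter's actual model: each unannotated leaf receives a distance-weighted mixture of the GO-term distributions observed at annotated leaves, standing in for exact inference on the Bayesian network; the tree distances and annotations are invented:

```python
from collections import defaultdict

# Toy inputs: patristic distances from annotated leaves to the query leaf,
# and GO annotations known for some genes (all invented).
dist  = {("g1", "g3"): 0.2, ("g2", "g3"): 0.9}
known = {"g1": {"GO:0016787": 1.0},   # hydrolase activity
         "g2": {"GO:0016301": 1.0}}   # kinase activity

def predict(leaf):
    # Weight each annotated gene by inverse tree distance, then normalize;
    # a crude stand-in for exact inference on Sifter's Bayesian network.
    scores, total = defaultdict(float), 0.0
    for src, terms in known.items():
        w = 1.0 / dist[(src, leaf)]
        total += w
        for go, p in terms.items():
            scores[go] += w * p
    return {go: s / total for go, s in scores.items()}

print(predict("g3"))
# {'GO:0016787': 0.818..., 'GO:0016301': 0.181...}: the closer annotated
# gene dominates, mirroring phylogeny-aware propagation.
```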
159

HIGH THROUGHPUT EXPERIMENTATION WITH DESORPTION ELECTROSPRAY IONIZATION MASS SPECTROMETRY TO GUIDE CONTINUOUS-FLOW SYNTHESIS

Harrison S Ewan (7900775) 21 November 2019 (has links)
The present work seeks to use high throughput experimentation (HTE) to guide chemical synthesis. We demonstrate an HTE system that uses a robotic liquid handler to prepare arrays of reactions and print them onto a surface for analysis by desorption electrospray ionization mass spectrometry (DESI-MS), as a tool to guide reaction optimization, synthetic route selection, and reaction discovery. DESI-MS was employed as a high throughput experimentation tool to provide qualitative predictions of reaction outcomes, so that vast regions of chemical reactivity space may be explored more rapidly and areas of optimal efficiency identified. This work is part of a larger effort to accelerate reaction optimization and enable the rapid development of continuous-flow syntheses of small molecules in high yield. In the present iteration of the system, reactions are scaled up from nanogram surface-printed reactions to milligram-scale microfluidic reactions, where more detailed analysis and further optimization may be performed. In the earliest iterations of this screening system, prior to the use of DESI, initial screening reactions were performed in electrospray (ESI) droplets and Leidenfrost droplets before scaling up to microfluidic reactions analyzed by ESI-MS. Insights from this combined droplet and microfluidic screening/rapid ESI-MS analysis approach helped guide the synthesis of diazepam. The system was further refined by the use of liquid handling robots and DESI-MS analysis, greatly accelerating the overall pace of screening. To build confidence in this approach, however, it is necessary to establish a robust predictive connection between reactions performed under analogous DESI-MS, batch, and microfluidic conditions. To achieve this goal, we first explored the potential of high throughput DESI-MS experiments to identify trends in reactivity based on chemical structure, solvent, temperature, and stoichiometry that are consistent across these platforms. While DESI-MS narrowed the scope of possibilities for reaction selection with some parameters, such as solvent, others, like stoichiometry and temperature, still required further optimization under continuous synthesis conditions. With increased confidence in DESI-MS HTE, we proceeded to apply it to rapidly evaluate large sets of aldol reactions of triacetic acid lactone (TAL), a compound well studied as a bio-based platform molecule that can be converted to a range of useful commodity chemicals, agrochemicals, and advanced pharmaceutical intermediates. Our DESI-MS HTE screening technique was used to rapidly evaluate known reactions of TAL in an effort to accelerate reaction discovery with platform chemicals. Applied to reaction discovery in this manner, our rapid experimentation system may help shorten the timescale of platform chemical development.
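
A minimal sketch of how such a reaction array might be enumerated for the liquid handler, assuming a simple full-factorial design; the solvents, temperatures, stoichiometries, and plate layout are placeholders, not the conditions used in this work:

```python
from itertools import product

# Placeholder screening dimensions for a full-factorial array.
solvents = ["MeOH", "MeCN", "DMSO"]
temps_c  = [25, 40, 60]
equiv_b  = [0.5, 1.0, 2.0]   # stoichiometry of reagent B relative to A

plate = []
for i, (solv, t, eq) in enumerate(product(solvents, temps_c, equiv_b)):
    well = f"{chr(ord('A') + i // 9)}{i % 9 + 1}"  # lay out as A1..C9
    plate.append({"well": well, "solvent": solv, "temp_C": t, "equiv_B": eq})

print(len(plate), "reactions")  # 27 conditions
print(plate[0])  # {'well': 'A1', 'solvent': 'MeOH', 'temp_C': 25, 'equiv_B': 0.5}
```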
160

Dissecting the genetic architecture of salt tolerance in the wild tomato Solanum pimpinellifolium

Morton, Mitchell 10 1900 (has links)
Salt stress severely constrains plant performance and global agricultural productivity: 5% of arable land, 20% of irrigated areas, and 98% of water reserves worldwide are saline. Improving the salt tolerance of major crop species could help attenuate yield losses, expand irrigation opportunities, and provide in situ relief in areas where poverty and food and water scarcity are prevalent. Increasing the salt tolerance of crops with high commercial and nutritional value, such as tomato (Solanum lycopersicum L.), would provide particularly significant economic and health benefits. However, salt tolerance is a complex trait with a limited genetic repertoire in domesticated crop varieties, including tomato, frustrating attempts to breed and engineer tolerant varieties. Here, a genome-wide association study (GWAS) was undertaken, leveraging the rich genetic diversity of the wild, salt-tolerant tomato Solanum pimpinellifolium and the latest phenotyping technologies to identify traits that contribute to salt tolerance and the genetic basis for variation in those traits. A panel of 220 S. pimpinellifolium accessions was phenotyped, focusing on image-based high-throughput phenotyping over time, in controlled and field conditions, in young and mature plants. Results reveal substantial natural variation in salt tolerance over time across many traits. In particular, unmanned aerial vehicle (UAV)-based remote sensing in the field allowed high-resolution RGB, thermal, and hyperspectral mapping that offers new insights into plant performance in the field over time. To empower the GWAS and facilitate identification of candidate genes, a new S. pimpinellifolium reference genome was generated: 811 Mb in size, with an N50 of ~76 kb and 25,970 annotated genes. Analysis of this reference genome highlighted potential contributors to salt tolerance, including enrichment of genes with stress response functions and a high copy number of the salt tolerance-associated gene inositol-3-phosphate synthase (I3PS). A recently completed full-genome re-sequencing of the panel, along with a newly available pseudomolecule-level assembly of the S. pimpinellifolium genome with an N50 of ~11 Mb, will serve to drive a GWAS identifying loci associated with traits that contribute to salt tolerance. Further research, including gene validation, breeding, genetic modification, and gene editing experiments, will drive the development of new salt-tolerant tomato cultivars.
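
A minimal sketch of the single-marker association test at the core of a GWAS, assuming genotypes coded as allele dosages 0/1/2 and a quantitative salt-tolerance phenotype; the data are simulated, and a real pipeline would add covariates, kinship correction, and multiple-testing control:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated toy data: 220 accessions x 1000 SNPs, dosage-coded 0/1/2.
n_acc, n_snp = 220, 1000
geno  = rng.integers(0, 3, size=(n_acc, n_snp))
pheno = 0.8 * geno[:, 42] + rng.normal(size=n_acc)  # SNP 42 made causal

pvals = np.empty(n_snp)
for j in range(n_snp):
    # Simple linear regression of phenotype on allele dosage.
    res = stats.linregress(geno[:, j], pheno)
    pvals[j] = res.pvalue

print("top hit: SNP", pvals.argmin(), "p =", pvals.min())  # expect SNP 42
```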
