151 |
Aplicação de triagem de alto desempenho na investigação das atividades enzimaticas e enantiosseletividades de microorganismos brasileiros / Enzymatic activities and Quick E in hydrolases screening applying fluorescent probes / Mantovani, Simone Moraes, 27 February 2007 (has links)
Advisor: Anita Jocelyne Marsaioli / Master's dissertation - Universidade Estadual de Campinas, Instituto de Quimica / Made available in DSpace on 2018-08-10T04:53:28Z (GMT). No. of bitstreams: 1
Mantovani_SimoneMoraes_M.pdf: 1495773 bytes, checksum: 493a0576675a8aad3c2879147aa7745f (MD5)
Previous issue date: 2007 / Resumo: In recent decades, reactions using biocatalysts have been widely applied in organic synthesis as key components of many industrial chemical processes, leading to an increased demand for new enzymes. The fastest and simplest way to detect enzymes is through high-throughput screening (HTS) methodologies that allow rapid identification of enzymatic activity, for example assays using fluorogenic and chromogenic compounds. In this work we applied HTS based on fluorogenic substrates to detect epoxide hydrolases and esterases in Brazilian microorganisms. Initially, five microorganisms with high epoxide hydrolase activity were selected, and 18 for the presence of esterases. Inspired by this principle, we adapted the methodology known as "Quick E" for the rapid evaluation of epoxide hydrolase enantioselectivities in whole cells, by measuring the initial rates of chiral fluorogenic substrates assayed separately with the addition of a competitor. The enantioselectivity assays showed that the experiments with a competitor gave enantioselectivity values very close to the E values determined by conventional biocatalysis. In addition, some microorganisms selected by HTS were tested in biotransformation reactions with substrates of synthetic interest, which, besides confirming the enzymatic activities and selectivities observed, revealed the ability of the microorganism C. albicans CCT 0776 to deracemize secondary alcohols by stereoinversion, furnishing (S)-1,2-octanediol in 100% theoretical yield and ee > 99%, and (S)-4-(phenylmethoxy)-1,2-butanediol in 45% ee / Abstract: In recent decades, biocatalysts have been applied in organic chemistry as key components of many industrial chemical processes, increasing the demand for novel enzymes. High-throughput screening (HTS) using fluorogenic probes is among the best assays for discovering new enzymes and is easily adapted to a whole-cell format. In this work, fluorogenic probes were applied to screen for epoxide hydrolases and esterases in Brazilian culture collections of microorganisms, which allowed the detection of epoxide hydrolases in five microorganisms and esterases in 18. Additionally, chiral probes were used to implement a Quick E assay for fast evaluation of epoxide hydrolase enantioselectivity by measuring initial rates of pure enantiomers. Optimization of the methodology revealed that nearly true E values were obtained in competitive experiments pairing each enantiomer with a substrate of similar reactivity. The Quick E assay was validated by determining conversion and ee using GC/MS and NMR (using mandelic acid derivatives) and is now a new method for determining the enantiomeric ratio of epoxide hydrolases. Finally, the outstanding HTS results were investigated further by conventional biocatalysis, revealing a stereoinversion process performed by C. albicans CCT 0776, which furnished (S)-1,2-octanediol in 100% theoretical yield and ee > 99%, and (S)-4-(phenylmethoxy)-1,2-butanediol in 45% ee / Master's / Organic Chemistry / Mestre em Química
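The Quick E arithmetic above is simple enough to sketch: each enantiomer's selectivity is estimated as its initial rate relative to the co-assayed reference competitor, and the ratio of the two selectivities approximates the enantiomeric ratio E. The snippet below is a minimal illustration of that calculation; the rate values are hypothetical plate-reader slopes, not data from this work.

```python
def quick_e(v_fast, v_ref_fast, v_slow, v_ref_slow):
    """Estimate the enantiomeric ratio E from initial rates (Quick E).

    Each enantiomer is assayed separately together with the same reference
    competitor; the ratio of (enantiomer rate / competitor rate) between
    the two assays approximates the ratio of specificity constants, i.e. E.
    """
    fast_selectivity = v_fast / v_ref_fast    # fast enantiomer vs. reference
    slow_selectivity = v_slow / v_ref_slow    # slow enantiomer vs. reference
    return fast_selectivity / slow_selectivity

# Hypothetical initial fluorescence slopes (RFU/min) from a plate reader:
print(quick_e(v_fast=12.0, v_ref_fast=3.0, v_slow=0.5, v_ref_slow=3.1))  # ~24.8
```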
|
152 |
Análise do problema do Protocolo MAC IEEE 802.11 em redes Ad Hoc Multihop / ALMEIDA, Adalton de Sena, January 2003 (has links)
Made available in DSpace on 2014-06-12T15:58:29Z (GMT). No. of bitstreams: 2
arquivo4635_1.pdf: 928763 bytes, checksum: f0830b47cdbb5936c9f93e74336e98d4 (MD5)
license.txt: 1748 bytes, checksum: 8a4605be74aa9ea9d79846c1fba20a33 (MD5)
Previous issue date: 2003 / The IEEE 802.11 DFWMAC (Distributed Foundation Wireless Medium Access Control) protocol was standardized for use in wireless local area networks and has been used to test and simulate multihop ad hoc wireless LANs. This protocol has shown problems in multihop ad hoc networks, which become evident when TCP (Transmission Control Protocol) traffic is carried between two stations. Because DFWMAC is a distributed medium access control protocol, with no central controller, the decision to transmit is made by the stations themselves according to the operation of DFWMAC. Also because of its distributed operation, hidden-terminal and exposed-terminal problems can occur, significantly compromising the traffic of TCP connections. In combination with the hidden-terminal and exposed-terminal problems, the Binary Exponential Backoff (BEB) algorithm contributes to this protocol's poor behavior in multihop ad hoc networks.
The combined effect of all these problems is degradation of TCP throughput, producing instability and unfairness in access to the shared medium. The instability is evident when throughput varies widely over very short time intervals, which can be observed with a single TCP connection between two stations. The unfairness problem appears when two simultaneous TCP connections are offered: one manages to transmit data packets at a high rate, using the entire bandwidth, while the other cannot transmit any packet, remaining at zero throughput for the whole time both connections are active.
This work proposes a solution to deal with these problems.
|
153 |
Sustainable Throughput – QoE Perspective / Darisipudi, Veeravenkata Naga S Maniteja, January 2017 (has links)
In recent years, there has been a significant increase in demand for streaming high-quality video on smart mobile phones. To meet user quality requirements, it is important to maintain end-user quality while taking resource consumption into consideration. This demand has led research communities and network providers to prioritize Quality of Experience (QoE) in addition to Quality of Service (QoS). QoE studies have therefore gained importance, creating the challenge of evaluating QoE in a way that takes quality, cost, and energy consumption into account. This gave rise to the concept of QoE-aware sustainable throughput, which denotes the maximal throughput at which QoE problems can still be kept at a desired level. The aim of this thesis is to determine sustainable throughput values from the QoE perspective. The values are observed for different delay and packet-loss levels in wireless and mobile scenarios. The evaluation uses subjective video quality assessment with the ITU-T-recommended Absolute Category Rating (ACR): video quality ratings are collected from users and averaged to obtain the Mean Opinion Score (MOS). The obtained scores are then analyzed to determine sustainable throughput values from the users' perspective. The results show that, for all video test cases, videos are rated higher at low packet-loss and low delay values. Videos subjected to delay are rated higher than videos subjected to packet loss. High-resolution videos proved fragile in the presence of stronger disturbances, i.e., high packet loss and larger delays. Considering all cases, QoE degradation due to delivery issues is at an acceptable minimum for the 360p (480x360) video; hence, 480x360 is the threshold resolution for sustaining video quality.
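A minimal sketch of the MOS computation, assuming ACR ratings on the standard five-point scale (1 = Bad ... 5 = Excellent); the ratings below are illustrative, not this study's data:

```python
def mean_opinion_score(ratings):
    """Average per-user ACR ratings for one test condition into a MOS."""
    return sum(ratings) / len(ratings)

# Hypothetical ratings from 12 viewers for one delay/packet-loss condition:
print(mean_opinion_score([4, 5, 3, 4, 4, 5, 3, 4, 4, 4, 5, 3]))  # 4.0
```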
|
154 |
Performance evaluation of Cassandra in AWS environment: An experiment / SUBBA REDDY GARI, AVINASH KUMAR REDDY, January 2017 (has links)
Context. In the field of computer science, cloud computing plays a prominent role: it allows data to be stored, managed, and processed on remote servers hosted on the internet, and cloud platforms enable users to perform large numbers of computing tasks across those servers. Several cloud platform providers exist, such as Amazon, Microsoft, Google, Oracle, and IBM, and they offer several databases for handling data. Cassandra is a NoSQL database system that can handle unstructured data and scale to large numbers of operations per second, even across multiple datacentres. Objectives. In this study, the performance of a NoSQL database is evaluated on the AWS cloud. The performance of a three-node Cassandra cluster is evaluated for different EC2 instance configurations, using throughput and CPU utilization as metrics. The main aim of this thesis was to evaluate the performance of Cassandra under various configurations with the YCSB benchmarking tool. Methods. A literature review was conducted to gain knowledge about the current research area, and the metrics required to evaluate the performance of Cassandra were identified through it. The experiment computed throughput and CPU utilization under the t2.micro, t2.small, and t2.medium configurations for 3-node and 6-node clusters using the YCSB benchmarking tool. Results. The results of the experiment comprise the metrics identified in the literature review, throughput and CPU utilization. The computed results were plotted as graphs to compare performance across the three configurations, and were grouped into two scenarios: the 3-node and the 6-node cluster. Conclusions. Based on the obtained throughput values, an optimal or sub-optimal configuration can be chosen for a data centre running multiple instances of Cassandra such that specific throughput requirements are satisfied.
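As a rough sketch of how such a run can be automated, the snippet below drives YCSB from Python and pulls the overall throughput out of its report. The binary path, workload file, and host list are placeholders for the experiment's own setup, and the snippet assumes YCSB's usual "[OVERALL], Throughput(ops/sec), ..." report line:

```python
import re
import subprocess

def ycsb_run_throughput(hosts, workload="workloads/workloada"):
    """Run a YCSB workload against a Cassandra cluster and return the
    overall throughput (ops/sec) parsed from the YCSB report."""
    out = subprocess.run(
        ["bin/ycsb", "run", "cassandra-cql", "-P", workload,
         "-p", f"hosts={hosts}"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"\[OVERALL\], Throughput\(ops/sec\), ([\d.]+)", out)
    return float(match.group(1)) if match else None

# e.g. ycsb_run_throughput("10.0.0.1,10.0.0.2,10.0.0.3")
```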
|
155 |
Integration and Evaluation of IoT Hardware and Software platforms / Xu, Ting, January 2017 (has links)
The Internet of Things (IoT) has grown rapidly in recent years, and its influence on everyday life and behavior is also increasing. It is a network that connects physical devices, vehicles, buildings, and other items embedded with electronics, software, sensors, actuators, and network connectivity, so that these objects can collect and exchange data. It has been used in many domains, such as transportation and logistics, healthcare, smart environments, and the personal and social domain. It is estimated that the IoT will consist of almost 50 billion objects by 2020. The IoT gateway plays an important role here: it bridges traditional communication networks with sensor networks, making network communication easier. This study aims to integrate and evaluate IoT gateways and IoT communication systems. It proposes a scenario in which the IoT gateway connects to an actuator in order to control the actuator and transmit the data via the IoT communication system; creates a demonstrator by setting up the communication between the IoT gateway platform and the IoT communication system; measures and evaluates performance in terms of latency and throughput using the implemented scenario; and finally draws conclusions.
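A minimal latency probe of the kind used for such measurements might look like the sketch below. It assumes a hypothetical TCP echo service exposed by the gateway; the host, port, message size, and ping count are placeholders:

```python
import socket
import time

def mean_rtt(host="gateway.local", port=7, pings=100, payload=b"x" * 64):
    """Average round-trip time of small echo messages to the gateway."""
    rtts = []
    with socket.create_connection((host, port), timeout=5) as sock:
        for _ in range(pings):
            start = time.perf_counter()
            sock.sendall(payload)
            remaining = len(payload)
            while remaining:                  # read until the echo is complete
                remaining -= len(sock.recv(remaining))
            rtts.append(time.perf_counter() - start)
    return sum(rtts) / len(rtts)
```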
|
156 |
Virtualization of Data Centers: Case Study on Server Virtualization / Kappagantula, Sri Kasyap, January 2018 (has links)
Nowadays, data centers use virtualization as a technique to provision independent virtual resources from the available physical hardware. Virtualization is implemented in data centers to maximize the utilization of physical hardware (which significantly reduces energy consumption and operating costs) without affecting Quality of Service (QoS). The main objective of this thesis is to study the different network topologies used in data center architectures, to compare the QoS parameters of virtual servers against physical servers, and to identify the better-suited virtualization technology. The research methodology used in this thesis is "qualitative". To measure QoS, we take the latency, packet loss, and throughput of virtual servers under different virtualization technologies (KVM, ESXi, Hyper-V, Fusion, and VirtualBox) and compare their performance against the physical server. The work also investigates CPU and RAM utilization and compares the behavior of physical and virtual servers under different load conditions. The results show that the virtual servers performed better in terms of resource utilization, latency, and response times when compared to the physical servers. However, factors such as backup and recovery, VM sprawl, capacity planning, and building a private cloud should be addressed for the growth of virtual data centers. Parameters that affect the performance of virtual servers are addressed, and the trade-off between virtual and physical servers is established in terms of QoS aspects. Overall, the virtual servers performed effectively compared to the physical servers.
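For illustration, a utilization probe of this kind can be written with the third-party psutil library; the sampling duration and period below are arbitrary, and this is a sketch rather than the thesis's actual tooling:

```python
import time
import psutil  # third-party: pip install psutil

def sample_utilization(duration_s=60, period_s=1.0):
    """Log system-wide CPU and RAM utilization while a load test runs."""
    samples = []
    for _ in range(int(duration_s / period_s)):
        cpu = psutil.cpu_percent(interval=period_s)   # % busy over the period
        ram = psutil.virtual_memory().percent         # % of RAM in use
        samples.append((time.time(), cpu, ram))
    return samples
```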
|
157 |
Characterization of the naïve kappa light chain murine immunoglobulin repertoire in spaceflight / Ward, Claire, January 1900 (has links)
Master of Science / Department of Biology / Stephen K. Chapes / Immunoglobulins are receptors expressed on the surface of B cells that can specifically bind pathogens and toxic substances within a host. These receptors are heterodimers of two chains, heavy and light, which are encoded at separate loci. Enzymatic splicing of gene segments at the heavy and light chain loci within the genomic DNA of every B cell results in a highly diversified and specific repertoire of immunoglobulins in a single host. Spaceflight is known to reduce splenic B cell populations and B cell progenitors within the bone marrow, potentially restricting the diversity of the immunoglobulin repertoire (Ig-Rep).
The objective of this thesis project was to characterize the impact of spaceflight on the kappa light-chain Ig-Rep of the C57BL/6 mouse. High-throughput sequencing (HTS) technologies have enabled the rapid characterization of Ig-Reps; however, standard Ig-Rep workflows often rely on amplification of immunoglobulin sequences to ensure that sequences from rare B cell clones are captured. Additionally, the Ig-Rep is often assessed in sorted B cell populations.
Opportunities for spaceflight experiments are limited and costly, and exclusive amplification of immunoglobulin sequences prior to HTS yields a dataset that cannot be mined for additional information. Furthermore, because of the difficulties of tissue collection in spaceflight, HTS of sorted B cell populations is not feasible. We therefore optimized a protocol in which the Ig-Rep is assessed from unamplified whole-tissue immunoglobulin transcripts. The Ig-Rep was characterized by gene segment usage, gene segment combinations, and the region in which gene segments are joined. HTS datasets from ground-control animals and from animals flown aboard the International Space Station were compared to explore the impact of spaceflight on the unimmunized murine Ig-Rep.
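A sketch of the gene-segment tallying described above, assuming AIRR-style annotated output (e.g., from IgBLAST) with v_call and j_call columns; the file name is a placeholder:

```python
import csv
from collections import Counter

def segment_usage(path="repertoire.tsv"):
    """Tally V and J gene segment usage and V-J pairings from an
    AIRR-format rearrangement table."""
    v_use, j_use, vj_pairs = Counter(), Counter(), Counter()
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            if not row.get("v_call") or not row.get("j_call"):
                continue                       # skip unannotated reads
            v = row["v_call"].split("*")[0]    # drop the allele suffix
            j = row["j_call"].split("*")[0]
            v_use[v] += 1
            j_use[j] += 1
            vj_pairs[(v, j)] += 1
    return v_use, j_use, vj_pairs
```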
|
158 |
Development and application of a rapid micro-scale method of lignin content determination in Arabidopsis thaliana accessionsChang, Xue Feng 05 1900 (has links)
Lignin is a major chemical component of plants and the second most abundant natural polymer after cellulose. The concerns and interests of agriculture and industry have stimulated the study of genes governing lignin content in plants in an effort to adapt plants to human purposes. Arabidopsis thaliana provides a convenient model for the study of the genes governing lignin content because of its short growth cycle, small plant size, and small completely sequenced genome. In order to identify the genes controlling lignin content in Arabidopsis accessions using Quantitative Trait Locus (QTL) analysis, a rapid micro-scale method of lignin determination is required.
The acetyl bromide method has been modified to enable the rapid micro-scale determination of lignin content in Arabidopsis. Modifications included the use of a micro-ball mill, adoption of a modified rapid method of extraction, use of an ice-bath to stabilize solutions and reduction in solution volumes. The modified method was shown to be accurate and precise with values in agreement with those determined by the conventional method. The extinction coefficient for Arabidopsis lignin, dissolved using acetyl bromide, was determined to be 23.35 g⁻¹ L cm⁻¹. This value is independent of the Arabidopsis accession, environmental growth conditions and is insensitive to syringyl/guaiacyl ratio. The modified acetyl bromide method was shown to be well correlated with the 72% sulfuric acid method once the latter had been corrected for protein contamination and acid-soluble lignin content (R² = 0.988, P < 0.0001).
As determined by the newly developed acetyl bromide method and confirmed by the sulfuric acid method, lignin content in Arabidopsis was found to be a divergent property. Lignin content in Arabidopsis was found to be weakly correlated with growth rate among Arabidopsis accessions (R² = 0.48, P = 0.011). Lignin content was also found to be correlated with plant height among Arabidopsis accessions (R² = 0.491, P < 0.0001). / Forestry, Faculty of / Graduate
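The acetyl bromide determination reduces to a Beer-Lambert back-calculation from the measured absorbance; the sketch below shows the arithmetic, with illustrative (not measured) absorbance, volume, and mass values:

```python
def acetyl_bromide_lignin_percent(absorbance, volume_l, mass_g,
                                  epsilon=23.35, path_cm=1.0):
    """Beer-Lambert back-calculation used by the acetyl bromide method:
    concentration c = A / (epsilon * path), lignin mass = c * V, reported
    as a percentage of the starting sample mass."""
    conc_g_per_l = absorbance / (epsilon * path_cm)
    return 100.0 * conc_g_per_l * volume_l / mass_g

# e.g. A = 0.35 measured on 10 mL of solution from 2 mg of cell-wall material:
print(acetyl_bromide_lignin_percent(0.35, 0.010, 0.002))  # ~7.5 % lignin
```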
|
159 |
Sifter-T: Um framework escalável para anotação filogenômica probabilística funcional de domínios protéicos / Sifter-T: A scalable framework for phylogenomic probabilistic protein domain functional annotation / Danillo Cunha de Almeida e Silva, 25 October 2013 (has links)
It is known that much software falls out of use because of its complex usability. Even tools known for the quality with which they execute a task are abandoned in favour of tools that are simpler to use, simpler to install, or faster. In the field of functional annotation, the tool Sifter (v2.0) is considered one of those with the best annotation quality. It was recently rated one of the best functional annotation tools by the Critical Assessment of protein Function Annotation (CAFA) experiment. Even so, it is not yet widely used, probably because of usability issues and the framework's unsuitability for large-scale use. The original SIFTER workflow consists of two main steps: retrieving the annotations for a list of genes, and generating a reconciled gene tree for the same list. Then, from the gene tree, Sifter builds a Bayesian network of the same structure, in which the leaves represent genes. The functional annotations of known genes are associated with these leaves, and the annotations are then propagated probabilistically along the Bayesian network to the leaves without a priori information. At the end of the process, a list of putative Gene Ontology functions and their probabilities of occurrence is generated for each gene of unknown function. The main goal of this work is to optimize the original source code for better performance, potentially allowing it to be used at genomic scale. While studying the data pre-processing workflow we found opportunities for improvement and devised strategies to address them. Among the implemented strategies are: the use of parallel threads; processing load balancing; revised algorithms for better use of disk, memory, and execution time; adaptation of the source code to the currently used formats of biological databases; improved user accessibility; expansion of the accepted input types; automation of the gene/species tree reconciliation process; sequence filtering to reduce the dimension of the analysis; and other smaller implementations. With this we achieved performance increases of up to 87-fold for annotation retrieval and 73.3% for gene tree reconstruction on quad-core machines, and a significant reduction in memory consumption in the realignment phase. The result of this implementation is presented as Sifter-T (Sifter optimized for Throughput), an open source tool with better usability, speed, and annotation quality than the original implementation of the Sifter workflow. Sifter-T was written modularly in the Python programming language; it was designed to simplify the task of annotating complete genomes and proteomes; and the results are presented in a way that facilitates the researcher's work.
Nevertheless, it is still not widely used, probably due to issues with usability and suitability of the framework to a high throughput scale. The original workflow SIFTER consists of two main steps: The annotation recovery for a list of genes and the reconciled gene tree generation for the same list. Next, based on the gene tree, Sifter builds a Bayesian network structure in which its leaves represent genes. The known functional annotations are associated to the aforementioned leaves, and then the annotations are probabilistically propagated along the Bayesian network to the leaves without a priori information. At the end of the process, a list of Gene Ontology functions and their occurrence probabilities is generated for each unknown function gene. This work main goal is to optimize the original source code for better performance, potentially allowing it to be used in a genome-wide scale. Studying the pre-processing workflow we found opportunities for improvement and envisioned strategies to address them. Among the implemented strategies we have: The use of parallel threads; CPU load balancing, revised algorithms for best utilization of disk access, memory usage and runtime; source code adaptation to currently used biological databases; improved user accessibility; input types increase; automatic gene and species tree reconciliation process; sequence filtering to reduce analysis dimension, and other minor implementations. With these implementations we achieved great performance speed-ups. For example, we obtained 87-fold performance increase in the annotation recovering module and 72.3% speed increase in the gene tree generation module using quad-core machines. Additionally, significant memory usage decrease during the realignment phase was obtained. This implementation is presented as Sifter-T (Sifter Throughput-optimized), an open source tool with better usability, performance and annotation quality when compared to the Sifter\'s original workflow implementation. Sifter-T was written in a modular fashion using Python programming language; it is designed to simplify complete genomes and proteomes annotation tasks and the outputs are presented in order to make the researcher\'s work easier.
|
160 |
HIGH THROUGHPUT EXPERIMENTATION WITH DESORPTION ELECTROSPRAY IONIZATION MASS SPECTROMETRY TO GUIDE CONTINUOUS-FLOW SYNTHESIS / Harrison S Ewan (7900775), 21 November 2019
The present work seeks to use high throughput experimentation (HTE) to guide chemical synthesis. We demonstrate the use of an HTE system that utilizes a robotic liquid handler to prepare arrays of reactions and print them onto a surface for analysis by desorption electrospray ionization mass spectrometry (DESI-MS), as a tool to guide reaction optimization, synthetic route selection, and reaction discovery. DESI-MS was employed as a high throughput experimentation tool to provide qualitative predictions of the outcome of a reaction, so that vast regions of chemical reactivity space may be explored more rapidly and areas of optimal efficiency identified. This work is part of a larger effort to accelerate reaction optimization and enable the rapid development of continuous-flow syntheses of small molecules in high yield. In the present iteration of this system, reactions are scaled up from these nanogram surface-printed reactions to milligram-scale microfluidic reactions, where more detailed analysis and further optimization may be performed. In the earliest iterations of this screening system, prior to the use of DESI, the initial screening reactions were performed in electrospray (ESI) droplets and Leidenfrost droplets before scaling up to microfluidic reactions, which were analyzed by ESI-MS. The insights from this combined droplet and microfluidic screening/rapid ESI-MS analysis approach helped guide the synthesis of diazepam. The system was further refined by the use of liquid handling robots and DESI-MS analysis, greatly accelerating the overall pace of screening. In order to build confidence in this approach, however, it is necessary to establish a robust predictive connection between reactions performed under analogous DESI-MS, batch, and microfluidic reaction conditions. To achieve this goal, we first explored the potential of high throughput DESI-MS experiments to identify trends in reactivity based on chemical structure, solvent, temperature, and stoichiometry that are consistent across these platforms. While DESI-MS narrowed the scope of possibilities for reaction selection with some parameters, such as solvent, others, like stoichiometry and temperature, still required further optimization under continuous synthesis conditions. With our increased confidence in DESI-MS HTE, we proceeded to explore its application to rapidly evaluate large sets of aldol reactions of triacetic acid lactone (TAL), a compound well studied for use as a bio-based platform molecule that may be converted to a range of useful commodity chemicals, agrochemicals, and advanced pharmaceutical intermediates. Our DESI-MS HTE screening technique was used to rapidly evaluate known reactions of TAL in an effort to accelerate reaction discovery with platform chemicals. Our rapid experimentation system, when applied to reaction discovery in this manner, may help to shorten the time scale of platform chemical development.
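As a loose illustration of how such a reaction array can be enumerated before being handed to the liquid handler, the sketch below builds a plate map as the Cartesian product of reagent and solvent lists; the reagent names and factors are hypothetical, not the conditions screened in this work:

```python
from itertools import product

# Hypothetical screening factors; the actual experiments also varied
# temperature and stoichiometry.
aldehydes = ["benzaldehyde", "4-anisaldehyde", "furfural"]
bases = ["DBU", "K2CO3"]
solvents = ["MeOH", "MeCN", "DMSO"]

plate_map = [
    {"well": i + 1, "aldehyde": a, "base": b, "solvent": s}
    for i, (a, b, s) in enumerate(product(aldehydes, bases, solvents))
]
print(len(plate_map))  # 18 candidate TAL aldol reactions to print for DESI-MS
```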
|