81

Sustainable Throughput Measurements for Video Streaming

Nutalapati, Hima Bindu January 2017 (has links)
With the increase in demand for video streaming services on handheld mobile terminals with limited battery life, it is important to maintain user Quality of Experience (QoE) while taking resource consumption into consideration. The goal is therefore to offer as good a quality as feasible while avoiding user annoyance, which requires delivering the video without uncontrollable quality distortions. This is possible when an optimal (or desirable) throughput value is chosen such that exceeding that threshold would enter a region of unstable QoE. Hence, the concept of QoE-aware sustainable throughput is introduced as the maximal value of the desirable throughput that avoids disturbances in QoE due to delivery issues, or keeps them at an acceptable minimum. The thesis measures sustainable throughput values when video streams of different resolutions are streamed from a server to a mobile client over wireless links in the presence of network disturbances (packet loss and delay). The video streams are collected at the client side for quality assessment, and the maximal throughput at which QoE problems can still be kept at a desired level is determined. Scatter plots were generated for the individual opinion scores and their corresponding throughput values for each disturbance case, and regression analysis was performed to find the best fit for the observed data. Logarithmic, exponential, linear, and power regressions were considered. The R-squared value was calculated for each regression model, and the model with the R-squared value closest to 1 was determined to be the best fit; the power and logarithmic models had the R-squared values closest to 1. Better quality ratings were observed for the low-resolution videos in the presence of packet loss and delay for the considered test cases. QoE disturbances can thus be kept at a desirable level for the low-resolution videos; among the test cases investigated, the 360p video was the most resilient to high delay and packet loss values and had better opinion scores, indicating that the throughput is sustainable at this threshold.
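As an editorial illustration of the regression comparison described in this abstract, the following Python sketch fits the four candidate models (linear, logarithmic, exponential, power) to hypothetical throughput/opinion-score pairs and ranks them by R-squared; the data values and variable names are assumptions for demonstration, not measurements from the thesis.

```python
import numpy as np

# Hypothetical throughput (Mbps) vs. mean opinion score pairs (illustrative only).
x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
y = np.array([1.8, 2.6, 3.4, 4.0, 4.4])

def r_squared(y_obs, y_fit):
    ss_res = np.sum((y_obs - y_fit) ** 2)
    ss_tot = np.sum((y_obs - y_obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

fits = {}
a, b = np.polyfit(x, y, 1)                    # linear: y = a*x + b
fits["linear"] = a * x + b
a, b = np.polyfit(np.log(x), y, 1)            # logarithmic: y = a*ln(x) + b
fits["logarithmic"] = a * np.log(x) + b
a, b = np.polyfit(x, np.log(y), 1)            # exponential: y = e^b * e^(a*x)
fits["exponential"] = np.exp(b) * np.exp(a * x)
a, b = np.polyfit(np.log(x), np.log(y), 1)    # power: y = e^b * x^a
fits["power"] = np.exp(b) * x ** a

# Rank models by R-squared, best fit first.
for name, y_fit in sorted(fits.items(), key=lambda kv: -r_squared(y, kv[1])):
    print(f"{name:12s} R^2 = {r_squared(y, kv[1] if False else y_fit):.3f}")
```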
82

Gas turbine application to CO2 pipeline : a techno-economic and environmental risk analysis

El-Suleiman, Abdussalam January 2014 (has links)
Gas turbines (GTs) are used extensively in pipelines to compress gas at suitable points. The primary objective of this study is to look at CO2 return pipelines and the close coupling of the compression system with advanced prime mover cycles. Adopting a techno-economic and environmental risk analysis (TERA) framework, this study models and evaluates the CO2 compression power requirements for gas-turbine-driven equipment (pump and compressor). The author developed and validated subroutines to implement variable stators in an in-house GT simulation code known as Variflow, in order to enhance the code's off-design performance simulation. This modification was achieved by altering the existing compressor maps and the main program algorithm of the code. An economic model based on the net present value (NPV) method, a CO2 compressibility factor model based on the Peng-Robinson equation of state, and a pipeline hydraulic analysis model based on the fundamental gas flow equation were also developed to facilitate the TERA of selected GT mechanical drives in two case scenarios. These case scenarios were built around Turbomatch-simulated GT design and off-design performance, ensuring that the CO2 is introduced into the pipeline at supercritical pressure and that its pressure stays above a minimum designated value during transmission along an adapted real-life pipeline profile. The required compression duty for the maximum and minimum CO2 throughput, together with the ambient conditions at the operating site, guided the selection of two GTs of 33.9 MW and 9.4 MW capacity. At the site ambient condition, off-design simulations of these GTs give outputs of 25.9 MW and 7.6 MW respectively. Given the assumed economic parameters over a plant life of 25 years, the NPV for deploying the 33.9 MW GT is about £13.9M while that of the 9.4 MW GT is about £1.2M; the corresponding payback periods (PBPs) are 3 and 7 years respectively. Thus, a good return on investment is achieved within reasonable risk. The sensitivity analysis shows an NPV of about £19.1M - £24.3M and about £3.1M - £4.9M for the 33.9 MW and 9.4 MW GTs respectively over a 25 - 50% fuel cost reduction, with PBPs of 3 - 2 years and 5 - 4 years respectively. In addition, as the CO2 throughput drops, the risk becomes higher with less return on investment; when the throughput drops to a certain level, the investment becomes highly unattractive and unable to pay back within the assumed 25-year plant life. The hydraulic analysis results for three pipe sizes of 24, 14 and 12¾ inch diameter show that pressure drop increases with CO2 throughput and decreases with pipe size for a given throughput. Owing to the effect of elevation differences, the 511 km pipeline profile gives rise to an equivalent length of 511.52 km. Given the pipeline inlet pressure of 15 MPa and other assumed pipeline data, the 3.70 MTPY (0.27 mmscfd) maximum average CO2 throughput considered in the 12¾ inch diameter pipeline results in a delivery pressure of about 15.06 MPa; under this condition, points of pressure spikes above the pipeline maximum allowable operating pressure (15.3 MPa) occur along the profile. Lowering the pipeline operating pressure to 10.5 MPa gives a delivery pressure of about 10.45 MPa, within safe pressure limits; at 10.5 MPa over a flat pipeline profile of the same length, the delivery pressure is about 10.4 MPa.
Thus, given the operating conditions for dense-phase CO2 pipeline transmission and the limits of this study, it is very unlikely that a booster station will be required. Likewise, compressing the CO2 to 15 MPa may no longer be necessary, which eliminates the need to combine a compressor and a pump for the initial pressure boost in order to save power: irrespective of the energy saving, the increase in capital cost of obtaining a pump and suitable driver far outweighs the extra expense of acquiring a GT mechanical drive rated to meet the compression duty.
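The NPV and payback-period reasoning above can be made concrete with a short sketch; the cash flows, discount rate, and function names below are illustrative assumptions, not figures from the study.

```python
def npv(rate, cash_flows):
    # cash_flows[0] is the initial (negative) investment at year 0.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    # Year in which cumulative cash flow first turns non-negative.
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None  # never pays back within the plant life

# Illustrative only: GBP millions, 25-year plant life, 10% discount rate.
investment = -30.0
annual_net_revenue = 4.5
flows = [investment] + [annual_net_revenue] * 25
print(f"NPV: {npv(0.10, flows):.1f} M GBP, payback: {payback_period(flows)} years")
```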
83

Biomass Allocation Variation Under Different Nitrogen and Water Treatments in Wheat

Seth A Tolley (7026389) 16 August 2019 (has links)
Wheat is among the most important cereal crops in the world today with respect to area harvested (219 million ha), production (772 million tonnes), and productivity (3.53 tonnes/ha). However, global wheat production for the coming decades is falling short of needed increases. Among the leading factors hindering yields is abiotic stress, which is present in nearly 38% of wheat acres globally. Nevertheless, many standard wheat breeding programs focus on yield and yield-related traits (i.e., grain yield, plant height, and test weight) in ideal environments rather than evaluating traits that could lead to enhanced abiotic stress tolerance. In this thesis, we explore the use of root and high-throughput phenotyping strategies to aid further development of abiotic-stress-tolerant varieties.

In the first three experiments, root phenotypes were evaluated under two nitrogen (N) treatments. Over a series of seedling, adult, and multiple-growth-stage destructive plant biomass measurements, above-ground and below-ground traits were analyzed in seven geographically diverse wheat accessions. Root and shoot biomass allocation in fourteen-day-old seedlings was analyzed using paper-roll-supported hydroponic culture in two Hoagland solutions containing 0.5 (low) and 4.0 (high) mM of N. Root traits were digitized using a WinRHIZO platform. For biomass analysis at maturity, plants were grown in 7.5-liter pots filled with soil mix using the same concentrations of N, and traits were measured as plants reached maturity. In the third N experiment, above- and below-ground traits were measured at the four-leaf stage, stem elongation, heading, post-anthesis, and maturity. At maturity, there was a ~15-fold difference between the lines with the largest and smallest root dry matter, but only a ~5-fold difference between genotypes for above-ground biomass. In the third experiment, root growth did not significantly change from stem elongation to maturity.

In the final experiment, two of these lines were selected for further evaluation under well-watered and drought treatments. The experiment was implemented in a completely randomized design in the Controlled Environment Phenotyping Facility (CEPF) at Purdue University. Differential water treatments were imposed at stem elongation and continued until post-anthesis, when all plants were destructively phenotyped. Image-based height and side-projected area were associated with measured height and shoot dry matter, with correlations of r=1 and r=0.98, respectively. Additionally, 81% of the variation in tiller number was explained using convex hull and side-projected area. Image-based phenotypes were used to model crop growth temporally, through which one of the lines was identified as relatively more drought tolerant. Finally, the Munsell Color System was explored to investigate drought response.

These experiments illustrate the value of phenotyping and the use of novel phenotyping strategies in wheat breeding to increase adaptation and the development of lines with enhanced abiotic stress tolerance.
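The reported 81% of tiller-number variation explained by convex hull and side-projected area corresponds to the R-squared of a two-predictor linear regression; a minimal sketch follows, with made-up trait values standing in for the CEPF measurements.

```python
import numpy as np

# Illustrative trait values for a handful of plants (not thesis data).
convex_hull = np.array([120.0, 150.0, 180.0, 210.0, 260.0, 300.0])
side_area = np.array([80.0, 95.0, 130.0, 160.0, 200.0, 240.0])
tillers = np.array([3.0, 4.0, 5.0, 6.0, 8.0, 9.0])

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones_like(convex_hull), convex_hull, side_area])
beta, *_ = np.linalg.lstsq(X, tillers, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((tillers - pred) ** 2) / np.sum((tillers - tillers.mean()) ** 2)
print(f"R^2 = {r2:.2f}")  # the thesis reports ~0.81 for its real data
```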
84

Development of an automated system for the measurement of focal ratio degradation of high numerical aperture fibres

Lee, Jooyoung 07 August 2019 (has links)
The thesis presents the development and testing of an automated fibre optic test system for the measurement of focal ratio degradation (FRD) in high numerical aperture fibres. In particular, the fibres under examination are proposed for use in the Maunakea Spectroscopic Explorer (MSE), a new telescope currently being designed for wide-field surveys of the night sky. A critical subsystem of the MSE is the Fibre Transmission System (FiTS), which connects the focal plane to the telescope's spectrographs. In preparation for MSE-FiTS, a method of characterizing the FRD between the input and output of every candidate multi-mode fibre is essential; the ultimate goal is the testing of all 4,332 fibres after assembly and prior to installation on MSE. An optical bench has been constructed to test the performance of an automated characterization system, a variation on the collimated beam test. Herein we present the underlying FRD measurement and analysis method, the optical design of the test bench, the motion control system, and the software for measuring FRD and controlling the automated test system. The open-source automation software, the Big FiTS Fibre Wrapper (Big FFW), is also introduced. Results of tests performed with the Big FFW on samples of candidate fibres are presented and compared with literature results obtained using manual methods. The results suggest that the candidate MSE fibre meets the science requirement of less than 5% focal ratio degradation for an f/2 input beam measured at the fibre output, with less than 1% disagreement between the automated measurement method and manual methods reported in the literature. The fully automated system can measure the FRD of up to 10 fibres in a typical MSE fibre bundle configuration. / Graduate
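A common way to quantify FRD in a collimated-beam style test is to infer the output focal ratio from encircled-energy spot radii measured at two detector distances. The sketch below illustrates that geometry with assumed numbers; it is not the Big FFW's actual algorithm, and the percentage metric shown is one plausible convention rather than the thesis' exact definition.

```python
def output_f_ratio(d1_mm, r1_mm, d2_mm, r2_mm):
    """Output f-ratio from 95%-encircled-energy spot radii at two distances.

    The output cone half-angle satisfies tan(theta) = (r2 - r1) / (d2 - d1),
    and f/# = 1 / (2 * tan(theta)).
    """
    tan_theta = (r2_mm - r1_mm) / (d2_mm - d1_mm)
    return 1.0 / (2.0 * tan_theta)

# Illustrative numbers: an f/2 input beam degraded to a slightly faster output cone.
f_in = 2.0
f_out = output_f_ratio(d1_mm=50.0, r1_mm=13.2, d2_mm=100.0, r2_mm=26.2)
frd_percent = 100.0 * (f_in - f_out) / f_in  # assumed convention for "% FRD"
print(f"f_out = {f_out:.2f}, FRD = {frd_percent:.1f}%")
```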
85

Identification of biologically-active PDE11-selective inhibitors using a yeast-based high throughput screen

Ceyhan, Ozge January 2012 (has links)
Thesis advisor: Charles S. Hoffman / The biological roles of the most recently discovered mammalian cyclic nucleotide phosphodiesterase (PDE) family, PDE11, are poorly understood, in part due to the lack of selective inhibitors. To address this need, I completed a ~200,000-compound high-throughput screen (HTS) for PDE11 inhibitors using a yeast-based growth assay. Further characterization of lead candidates using both growth-based assays in the fission yeast Schizosaccharomyces pombe and in vitro enzyme assays identified four potent and selective PDE11 inhibitors. I examined the effect of these compounds on human adrenocortical cells, where PDE11 is believed to regulate cortisol levels. One compound, along with two structural analogs, elevates cAMP levels and cortisol production through PDE11 inhibition, thus phenocopying the behavior of adrenocortical tumors associated with Cushing syndrome. These compounds can be used as research tools to study the biological function of PDE11 and can also serve as leads for developing therapeutic compounds for the treatment of adrenal insufficiencies. This study further validates the yeast-based HTS platform as a powerful tool for the discovery of potent, selective, and biologically active PDE inhibitors. / Thesis (PhD) — Boston College, 2012. / Submitted to: Boston College. Graduate School of Arts and Sciences. / Discipline: Biology.
86

The response of soil microbial communities to vegetable cropping systems analyzed for RNA- and DNA-based sampling

Gomez-Montano, Lorena January 1900 (has links)
Doctor of Philosophy / Department of Plant Pathology / Ari Jumpponen / Megan Kennelly / Soil microbial communities play fundamental and complex roles in the productivity of agriculture. However, we still have a limited understanding of how microbial communities respond to different farming systems, such as organic and conventional fertility management regimens. We applied high-throughput sequencing to develop a better understanding of how soil microbial communities (bacteria and fungi) in vegetable production respond to organic or conventional soil fertility management. Specifically, my three studies examined the following questions: (1) How do soil microbial communities from cDNA and DNA samples compare in organic and conventional fertility treatments? (2) How do soil microbial communities in a tomato cropping season respond to long-term organic vs. conventional soil fertility treatments? (3) How do soil bacterial and fungal communities respond to high tunnels, plastic mulch, and organic amendments across a tomato cropping season? The first two questions were addressed at the Kansas State University Horticulture and Extension Center in Olathe, KS, using organic and conventional field plots with three levels of fertilizer, sampled during the development of a tomato crop. The third question was addressed at a commercial farm in Lawrence, KS, during its transition to organic vegetable production, during a tomato crop; this experiment included field plots versus high tunnels as treatments, along with three organic nutrient amendments. For the first question, we used 454 pyrosequencing of bacterial and fungal ribosomal markers to compare total resident (DNA) and active (cDNA, which is DNA synthesized from a single-stranded RNA template) microbial communities; for the second and third questions, we used Illumina MiSeq metabarcoding of bacterial and fungal ribosomal markers. In all three studies we evaluated bacterial and fungal community responses using Simpson's diversity index, Simpson's evenness, and richness. For the first question, when we compared DNA and cDNA, bacterial diversity was higher in cDNA samples from organic than from conventional management; in addition, fungal diversity from cDNA samples was higher than from DNA samples. In contrast, for the second question, bacterial and fungal diversity indices did not differ in the tomato crop under organic and conventional management systems. For the third question, high tunnels did not affect bacterial or fungal diversity, and use of plastic mulch for a tomato crop in open field plots did not affect bacterial richness but decreased fungal richness compared to open field plots without plastic mulch. High-throughput sequencing provides a new perspective on the structure and dynamics of these communities. Information from this approach will ultimately improve our ability to manage soil for sustainable productivity by promoting beneficial microorganisms and suppressing pathogenic ones.
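For reference, Simpson's diversity index, Simpson's evenness, and richness as used in these comparisons can be computed from read counts as sketched below; the OTU count vector is an illustrative assumption.

```python
import numpy as np

def simpson_metrics(counts):
    """Simpson's diversity (1 - sum p_i^2), inverse form, evenness, richness."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    d = np.sum(p ** 2)               # Simpson's dominance
    richness = np.count_nonzero(counts)
    return {
        "diversity": 1.0 - d,        # probability two reads are different taxa
        "inverse": 1.0 / d,          # effective number of taxa
        "evenness": (1.0 / d) / richness,
        "richness": richness,
    }

# Illustrative OTU read counts for one soil sample (not study data).
print(simpson_metrics([120, 80, 40, 10, 5, 3, 1, 1]))
```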
87

Sifter-T: A scalable framework for phylogenomic probabilistic protein domain functional annotation

Silva, Danillo Cunha de Almeida e 25 October 2013 (has links)
It is known that much software falls out of use because of complex usability. Even tools known for the quality of their results are abandoned in favour of tools that are faster or simpler to use or install. In the functional annotation field, Sifter (v2.0) is regarded as one of the tools with the best annotation quality, and it was recently rated among the best functional annotation tools by the Critical Assessment of Protein Function Annotation (CAFA) experiment. Nevertheless, it is still not widely used, probably due to issues with usability and the suitability of the framework for high-throughput scale. The original SIFTER workflow consists of two main steps: annotation retrieval for a list of genes and generation of a reconciled gene tree for the same list. Next, based on the gene tree, Sifter builds a Bayesian network of the same structure, in which the leaves represent genes. The known functional annotations are associated with these leaves, and the annotations are then probabilistically propagated along the Bayesian network to the leaves without a priori information. At the end of the process, a list of putative Gene Ontology functions and their occurrence probabilities is generated for each gene of unknown function. The main goal of this work is to optimize the original source code for better performance, potentially allowing it to be used at genome-wide scale. Studying the pre-processing workflow, we found opportunities for improvement and devised strategies to address them. Among the implemented strategies are: the use of parallel threads; CPU load balancing; revised algorithms for better use of disk, memory, and runtime; adaptation of the source code to currently used biological database formats; improved user accessibility; support for more input types; automation of the gene tree and species tree reconciliation process; sequence filtering to reduce the dimension of the analysis; and other minor implementations. With these implementations we achieved substantial performance gains: an 87-fold speed-up in the annotation retrieval module and a 72.3% speed increase in the gene tree generation module on quad-core machines, together with a significant decrease in memory usage during the realignment phase. This implementation is presented as Sifter-T (Sifter Throughput-optimized), an open-source tool with better usability, performance, and annotation quality than the original Sifter workflow implementation. Sifter-T is written in a modular fashion in the Python programming language; it is designed to simplify the annotation of complete genomes and proteomes, and its outputs are presented in a way that makes the researcher's work easier.
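The parallel-thread strategy credited with part of Sifter-T's speed-up can be illustrated with a thread pool that retrieves annotations for many genes concurrently; fetch_annotations and the gene identifiers below are hypothetical placeholders, not Sifter-T's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_annotations(gene_id):
    # Hypothetical placeholder: in a Sifter-like workflow this step queries
    # local biological databases for Gene Ontology annotations of gene_id.
    return gene_id, []

genes = ["geneA", "geneB", "geneC", "geneD"]  # illustrative identifiers

# Annotation retrieval is largely I/O-bound, so threads help even under
# Python's GIL; the pool size is an illustrative choice for a quad-core box.
with ThreadPoolExecutor(max_workers=4) as pool:
    annotations = dict(pool.map(fetch_annotations, genes))
```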
88

Optimization of Packet Throughput in Docker Containers

Ginka, Anusha, Salapu, Venkata Satya Sameer January 2019 (has links)
Container technology has gained popularity in recent years, mainly because it enables a fast and easy way to package, distribute, and deploy applications and services. Latency and throughput have a high impact on user satisfaction in many real-time, critical, and large-scale online services. Although the use of microservices architecture in cloud-native applications has brought advantages in terms of application resilience, scalability, fast software delivery, and minimal resource use, packet processing rates are not correspondingly higher. This is mainly due to the overhead imposed by the design and architecture of the network stack. Packet processing rates can be improved by making changes to the network stack, without necessarily adding more powerful hardware. In this research, a study of various high-speed packet processing frameworks is presented, and a software-based high-speed packet I/O solution, one that is as hardware-agnostic as possible, is identified to improve packet throughput in container technology. The solution is selected according to whether it requires changes to the underlying hardware. The proposed solution is then evaluated in terms of packet throughput for different container networking modes, and compared with a simple UDP client-server application in the same modes. From the results obtained, it is concluded that the packet mmap client-server application achieves higher performance than the simple UDP client-server application.
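A baseline of the kind compared in this thesis, a simple UDP client-server throughput test, can be sketched in Python as follows; the address, payload size, and packet count are illustrative assumptions.

```python
import socket
import sys
import time

ADDR = ("127.0.0.1", 9999)      # illustrative address and port
PKT_SIZE = 1400                 # illustrative payload size (bytes)
COUNT = 100_000                 # illustrative packet count

def sender():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"x" * PKT_SIZE
    for _ in range(COUNT):
        sock.sendto(payload, ADDR)

def receiver():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(ADDR)
    sock.settimeout(2.0)        # stop after 2 s of silence
    received, start, last = 0, None, None
    try:
        while True:
            data = sock.recv(65535)
            last = time.perf_counter()
            start = start or last
            received += len(data)
    except socket.timeout:
        pass
    if start and last and last > start:
        mbps = received * 8 / (last - start) / 1e6
        print(f"received {received} bytes, ~{mbps:.1f} Mbps")

if __name__ == "__main__":
    # Run "python udp_bench.py recv" in one container and "... send" in another.
    receiver() if sys.argv[1] == "recv" else sender()
```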
89

Development of Methods for the Discovery of Small Molecule Biological Probes

Yozwiak, Carrie Elizabeth January 2017 (has links)
Advances in combinatorial chemistry have facilitated the production of large chemical libraries that can be used as tools to discover biological probes and therapeutics. High-throughput screening (HTS) strategies have emerged as the standard method to assess the biological activity of small molecules. These screens involve the individual analysis of each small molecule in multi-well plates, often requiring expensive automated methods and the development of robust assays that may not translate to physiologically relevant contexts. The problem of evaluating large numbers of reagents in physiologically relevant cell and animal models has been addressed for genetic reagents such as RNAi, CRISPR, and cDNA by creating barcoded retroviral libraries that can be used to infect target cells in culture or in animal models. Using these tools, effective reagents can be selected and decoded by a rapid and inexpensive procedure, compared with testing individual reagents one at a time in an arrayed fashion. A pooled approach would similarly be useful for analyzing small molecules more efficiently. This dissertation describes studies toward developing a pooled screening strategy for small molecules in cellular contexts. Through an initial screen, we set out to phenotypically profile small-molecule biological activity in a pooled fashion while simultaneously gaining insight into an individual active molecule's mechanism of action. I first describe the design of the pooled screen and define the goals necessary for successful application. Next, I outline the steps taken and challenges encountered during the invention of each component of the technology. Finally, I discuss a computational, target-based approach to designing small molecules appropriate for future applications of the new screening technology.
90

Development of a customizable architecture for TCP/IP stack processing in hardware

Hamerski, Jean Carlo January 2008 (has links)
The growing popularity of the Internet and the creation of new transmission media have stimulated explosive growth in data-transmission rates over the Internet. Software-based TCP/IP processing has thus become a bottleneck, because it cannot process packets at transmission-line speeds, especially packets of the transport layer. This limitation motivates implementing TCP/IP processing in hardware, which brings advantages such as accelerated data-flow processing. In this context, this work presents the iNetCore architecture, described in VHDL, which processes the network- and transport-layer protocols in hardware. Two implementations of this architecture were developed in order to explore the design space and to analyze synthesis results for ASIC and FPGA technologies, as well as packet-processing performance; a simulation environment was also built for the performance analysis. A HW/SW architecture containing the iNetCore was prototyped on the Virtex-II Pro Development System board. In conjunction with this architecture, a communication interface with the OPB bus was implemented, making it possible to develop application-layer software that uses the TCP/IP stack implemented in hardware. Finally, experiments were carried out to evaluate the performance of the HW/SW architecture in processing TCP segments. The HW/SW architecture together with the iNetCore reached a throughput of up to 1.45 Gbps in TCP/IP packet processing, demonstrating its potential to exploit the full bandwidth available in gigabit networks.
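One core operation that any TCP/IP stack, in hardware or software, must perform is the Internet checksum (RFC 1071) over headers and data; the sketch below shows that algorithm for reference and is not the iNetCore implementation.

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 checksum: one's-complement sum of 16-bit big-endian words."""
    if len(data) % 2:
        data += b"\x00"                           # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

# Example: checksum of a trivial 20-byte sequence (e.g., an IP header's size).
print(hex(internet_checksum(bytes(range(20)))))
```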
