About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Delay Modeling In Data Center Networks: A Taxonomy and Performance Analysis

Alshahrani, Reem Abdullah 06 August 2013 (has links)
No description available.
52

Chemical-genetics identifies two mechanistically unique spiro-analogs: an inhibitor of bacterial iron homeostasis and a zinc chelator that re-sensitizes a metallo-beta-lactamase-producing pathogen to carbapenem antibiotics / Antibacterial activity through metal chelation

Falconer, Shannon Beth January 2014 (has links)
Concomitant with antibiotic use is the development of bacterial strains that are resistant to such compounds. Presently, the rate at which antibiotic-resistant pathogenic bacteria are emerging is outpacing our resupply of new antibacterials; therefore, renewed efforts to identify novel therapies are urgently needed. Transition metals are required by all life forms and, for bacteria, an adequate supply of nutrient metal is necessary to establish infection in a host. Indeed, as an antibacterial defense mechanism, eukaryotes have developed various means by which to restrict the availability of metal to the invading pathogen, thereby limiting its chances for successful colonization. As such, bacterial metal acquisition and homeostasis have been suggested as potential antibiotic targets to explore for the identification of new antibacterial small molecules. In this thesis I discuss my development of a high-throughput screening assay that specifically selects for compounds that perturb bacterial iron homeostasis. The results of this work led to the identification of a series of spiro-indoline-thiadiazole compounds that are toxic to bacteria via iron chelation. In addition to molecules that perturb the availability of bacterial intracellular iron, we present a series of spiro-indoline-thiadiazole analogs that inhibit bacterial growth by limiting zinc availability. Furthermore, we show that the respective zinc-perturbing analogs re-sensitize an otherwise drug-resistant strain of NDM-1-harbouring Klebsiella pneumoniae to carbapenem antibiotics. We discuss the potential for this class of compounds to serve as carbapenem adjuvants for treating infections caused by metallo-β-lactamase-containing pathogens. / Thesis / Doctor of Philosophy (PhD)
53

The High-Throughput Micro-Adhesion Tester

Collis, Andrea 02 1900 (has links)
The high-throughput micro-adhesion tester (HMAT) was constructed to test the adhesive strength of polymers. The design criteria included the ability to rapidly test many different samples in a serial format, and a probe design that would complement this objective by being easy to place on and pull from the samples and easy to mass-produce. The HMAT was able to perform 48 adhesion tests at about 30 s per test, for a total of 24 min. The final probes were made from a capillary tube with a small metal cap on top for ease of lifting. They are easy to make and easy to place in and pull from the custom probe box. The probe box was designed to hold the probes upright while the polymer is drying and during the test, without interfering with the test itself. Tests on PDMS show reasonable repeatability, with the standard deviation being about 20% of the mean value. Since the HMAT is meant to be used for primary screening, the accuracy of the measurements is not as critical as it would be for later tests. / Thesis / Master of Applied Science (MASc)
54

Throughput Optimization and Transmitter Power Saving (TOTPS) Algorithm and Extended TOTPS (ETOTPS) Algorithm for IEEE 802.11 Links

Mo, Tianmin 30 October 2006 (has links)
The IEEE 802.11 wireless local area network (WLAN) standard supports multiple transmission modes. However, a higher mandatory data rate mode does not necessarily yield higher throughput. This research starts from the relationship between the link throughput and the channel's carrier-to-noise (C/N) ratio. Two algorithms are proposed based on knowledge of the C/N ratio at the receiver: a throughput optimization and transmitter power saving (TOTPS) algorithm and an extended throughput optimization and transmitter power saving (ETOTPS) algorithm. In particular, we take the approach of adjusting link parameters such as transmitter power and transmission mode to achieve the maximum throughput at different C/N values. Because the TOTPS algorithm reduces transmitter power whenever it can do so without degrading link throughput, it not only prolongs battery life, which is critical in ad hoc wireless networks, but also reduces potential interference to neighboring wireless network systems. The ETOTPS algorithm, on the other hand, aims for higher throughput by trading in more transmitter power, which is particularly desirable for high-speed data transfer in emergency situations. Both algorithms are developed to apply to IEEE 802.11b, IEEE 802.11a, and IEEE 802.11g links. / Ph. D.
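The core idea, selecting the transmission mode that maximizes throughput for the observed C/N, can be sketched as follows. This is a hypothetical illustration, not the dissertation's algorithm: the C/N thresholds and throughput figures are invented placeholder values.

```python
# Hypothetical sketch of the mode-selection idea: for a measured
# carrier-to-noise ratio, pick the transmission mode that maximizes
# expected throughput.  The thresholds and throughput numbers below are
# illustrative placeholders, not values from the dissertation.

# (mode name, minimum C/N in dB for reliable reception, nominal throughput in Mbit/s)
MODES = [
    ("1 Mbit/s DBPSK",  4.0, 0.9),
    ("2 Mbit/s DQPSK",  7.0, 1.7),
    ("5.5 Mbit/s CCK", 10.0, 4.3),
    ("11 Mbit/s CCK",  13.0, 7.1),
]

def best_mode(cn_db):
    """Return the usable mode with the highest throughput at this C/N."""
    usable = [m for m in MODES if cn_db >= m[1]]
    if not usable:
        return None  # link cannot be sustained in any mode
    return max(usable, key=lambda m: m[2])

print(best_mode(11.5)[0])  # 5.5 Mbit/s CCK: fastest mode whose threshold is met
```

A power-saving variant would additionally lower transmitter power (and hence C/N) as long as the selected mode's threshold is still met.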
55

Sequencing policy for a CONWIP production system

Greco, Michael P. 29 August 2008 (has links)
The optimization of the performance of a constant work-in-progress (CONWIP) production system through the sequencing of its backlog list is investigated. The performance measures considered are throughput, optimum WIP level (m*), and flow time. The effects of sequence-dependent bottlenecks on system performance are analyzed. A procedure is presented to determine a lower bound for m* given the product mix, and a method is provided that determines m* given the sequence of jobs. A heuristic algorithm is provided for determining a sequence that minimizes m*. The algorithm attempts to sequence the jobs to achieve the "best fit" between consecutive jobs so that machine and job idle times are minimized, and it is tested through computer implementation to evaluate its performance. / Master of Science
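A greedy "best fit" pairing of consecutive jobs, in the spirit described above, can be sketched as follows; the mismatch measure and the job data are invented for illustration and are not the thesis's actual algorithm.

```python
# Illustrative sketch of a greedy best-fit sequencing heuristic: order the
# backlog so that consecutive jobs have similar per-machine processing
# times, a rough proxy for minimizing machine and job idle time.  The
# distance measure and job data are assumptions, not the thesis's method.

def mismatch(a, b):
    """Proxy for the idle time induced by running job b right after job a."""
    return sum(abs(x - y) for x, y in zip(a, b))

def greedy_sequence(jobs):
    """Start from the first job and always append the closest remaining job."""
    remaining = list(range(1, len(jobs)))
    seq = [0]
    while remaining:
        last = jobs[seq[-1]]
        nxt = min(remaining, key=lambda j: mismatch(last, jobs[j]))
        seq.append(nxt)
        remaining.remove(nxt)
    return seq

# processing times on three machines for four jobs (toy data)
jobs = [(2, 5, 1), (2, 4, 1), (9, 1, 7), (8, 2, 6)]
print(greedy_sequence(jobs))  # [0, 1, 3, 2]: similar jobs end up adjacent
```

A real implementation would replace `mismatch` with a measure derived from the CONWIP line's actual idle-time behavior.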
56

Protein Bioseparation using Synthetic Membranes: Enhancement of Selectivity and Throughput

Kanani, Dharmeshkumar M. January 2007 (has links)
Cost-effective large-scale protein bioseparation will be a key issue for the biopharmaceutical industry in the coming years. Conventional protein purification techniques are severely limited in that they give either good selectivity of separation at the cost of throughput, or vice versa. Synthetic-membrane-based bioseparation techniques such as high-resolution ultrafiltration and membrane chromatography have the potential to combine high throughput with high selectivity. This thesis focuses on approaches for obtaining both selectivity and throughput in membrane-based protein bioseparation processes. Obtaining high selectivity is one of the main objectives in high-resolution ultrafiltration. This thesis reports a novel approach for flexibly manipulating the selectivity of protein separation using a dual-facilitating agent. It is shown for the first time that the selectivity of separation can be altered as desired: if required, the selectivity can be reversed, so that smaller proteins are retained and larger proteins are made to permeate. The results are explained in terms of protein-protein electrostatic interactions and the Donnan effect. This novel approach is expected to significantly increase the flexibility of carrying out high-resolution ultrafiltration. Membrane chromatography is based on the use of stacks of microporous synthetic membranes as chromatographic media. Because commercial membranes have lower binding capacities than conventional beads for packed-bed chromatography, the commercial success of membrane chromatography is largely limited to flow-through applications. The study on membrane chromatography addresses the performance of new types of high-capacity macroporous gel-filled membranes for ion-exchange chromatography of proteins, and demonstrates the suitability of one of these novel membranes for fractionation of plasma proteins.
Membrane fouling reduces product throughput and is considered a major problem in pressure-driven membrane processes such as microfiltration and ultrafiltration. This thesis reports some significant contributions in the area of membrane fouling. A novel yet conceptually simple approach is discussed for modeling flux decline in constant-pressure ultrafiltration that takes into account the interplay between flux, concentration polarization, and membrane fouling. Conventional fouling models account for the effects of concentration polarization and membrane fouling in a simply additive way; the basic hypothesis of the model discussed here is that flux decline in constant-pressure ultrafiltration is self-attenuating in nature. This new approach is expected to be very useful in deciding the start-up conditions of membrane processes. Despite widespread use of in-line microfiltration for sterilization of therapeutic proteins prior to formulation, there has been no systematic study of fouling in such processes. Part of the fouling work in this thesis examines how resistance to filtration increases during in-line microfiltration of concentrated protein solutions, along with the mechanism of protein fouling, and assesses the severity of fouling in terms of apparent reversible fouling and irreversible fouling. Traditional methods for measuring the protein-fouling resistance of membranes are time-consuming and expensive. This thesis reports three protocols for comparing the performance of microfiltration membranes for protein filtration. The first protocol, based on accelerated fouling in dead-end mode using a pulsed-injection technique, is rapid, simple, and cost-effective, and gives valuable information about membrane performance. The remaining two protocols are based on the critical-flux concept. / Thesis / Doctor of Philosophy (PhD)
57

A Computer Vision Tool For Use in Horticultural Research

Thoreson, Marcus Alexander 13 February 2017 (has links)
With growing concerns about the global food supply and the environmental impacts of modern agriculture, demand for horticultural research is increasing. While research into plant genetics has seen increased throughput from recent technological advancements, plant phenotypic research throughput has lagged behind. Improvements in open-source image-processing software and image-capture hardware have created an opportunity to develop more competitively priced, faster data-acquisition tools. These tools could be used to collect measurements of plants' phenotypes on a much larger scale without sacrificing data quality. This paper demonstrates the feasibility of creating such a tool. The resulting design used stereo vision and image-processing routines from the OpenCV project to measure a representative collection of observable plant traits, such as leaflet length or plant height. After the stereo camera was assembled and calibrated, visual and stereo images of potato plant canopies and tubers (potatoes) were collected. By processing the visual data, the meaningful regions of the image (the canopy, the leaflets, and the tubers) were identified. The same regions in the stereo images were used to determine plant physical geometry, from which the desired plant measurements were extracted. Using this approach, the tool had an average accuracy of 0.15 inches for distance measurements. Additionally, the tool detected vegetation, tubers, and leaves with average Dice indices of 0.98, 0.84, and 0.75, respectively. To compare the tool's utility to that of traditional implements, a study was conducted on a population of 27 potato plants belonging to 9 separate genotypes. Both the newly developed and traditional measurement techniques were used to collect measurements of a variety of the plants' characteristics.
A multiple linear regression of the plant characteristics on the plants' genetic data showed that the measurements collected by hand were generally better correlated with genetic characteristics than those collected using the developed tool; the average adjusted coefficient of determination for hand measurements was 0.77, while that for tool measurements was 0.66. Though the platform's aggregate results fall short of hand measurement, this work has demonstrated that such an alternative to traditional data-collection tools is certainly attainable. / Master of Science / With growing concerns about the global food supply and the environmental impacts of modern agriculture, demand for horticultural research is increasing. While research into plant genetics has seen increased throughput from recent technological advancements, the throughput of research into how those genetic traits are expressed (plant phenotype) has lagged behind. Improvements in open-source image-processing software and image-capture hardware have created an opportunity to develop more competitively priced, faster data-acquisition tools. These tools could be used to collect measurements of plants' phenotypes on a much larger scale without sacrificing data quality. This paper demonstrates the feasibility of creating such a tool. The tool developed in this work was an array of two USB webcams capable of producing distance measurements, made possible largely by software written in C++ and maintained by the OpenCV project. The tool's effectiveness was evaluated by comparing its measurement-taking ability to that of horticultural researchers measuring by hand, using both collection methods in a study of a population of potato plants. 
The result of this comparison was evidence that although the tool developed in this work was overall less effective at generating relevant measurements, further work on the project could yield improvements. Additionally, the tool reduced the average measurement time per plant from 120 seconds to 14 seconds.
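The Dice indices quoted above measure the overlap between a detected region and a ground-truth region. A minimal sketch of the metric on flattened binary masks (the masks here are toy data, not the thesis's images):

```python
# Dice coefficient 2|A∩B| / (|A| + |B|) for two binary segmentation masks,
# the overlap measure used to score the vegetation/tuber/leaf detections.
# The masks below are invented toy data for illustration only.

def dice(a, b):
    """Dice index of two equal-length binary masks (1 = pixel in region)."""
    inter = sum(x and y for x, y in zip(a, b))
    size = sum(a) + sum(b)
    return 2 * inter / size if size else 1.0

detected = [1, 1, 1, 0, 0, 1]  # pixels the tool labeled as "leaf"
truth    = [1, 1, 0, 0, 1, 1]  # pixels labeled "leaf" by hand
print(dice(detected, truth))   # 0.75: three shared pixels, four in each mask
```

A score of 1.0 means perfect agreement; 0.0 means no overlap at all.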
58

Uma proposta metodológica para a previsão do Throughput durante a inicialização de redes Profinet através de redes neurais artificiais / A methodological proposal for predicting throughput during the initialization of Profinet networks using artificial neural networks

Sestito, Guilherme Serpa 28 November 2014 (has links)
Este trabalho propõe o desenvolvimento de uma metodologia para o cálculo do volume de tráfego durante o período de inicialização de uma rede Profinet. O tráfego de dados é um dos indicadores de desempenho criados para garantir a qualidade dos protocolos baseados em Real Time Ethernet (RTE). Neste contexto, buscou-se na literatura uma forma de classificar o tráfego de acordo com a sua magnitude e mensurar seu efeito na comunicação. Dados provenientes de redes criadas em laboratório foram coletados e aplicados a uma Rede Neural Artificial visando generalizar o conhecimento adquirido. O uso dado a RNA foi de estimação da função de interesse. Os resultados obtidos após o processamento dos dados reais são considerados satisfatórios e condizentes às expectativas dessa dissertação, já que se buscou, por razões inerentes ao problema estudado, um erro relativo inferior 3%. Conclui-se que a metodologia apresentada é factível e aplicável ao meio industrial, podendo ser parte de uma ferramenta mais completa, como os analisadores de redes Profinet. / This work proposes a methodology for calculating the traffic volume during the start-up period of a Profinet network. Data traffic is one of the performance indicators created to guarantee the quality of protocols based on Real Time Ethernet (RTE). In this context, the literature was surveyed for a way to classify traffic according to its magnitude and to measure its effect on communication. Data from networks built in the laboratory were collected and fed to an artificial neural network (ANN) in order to generalize the acquired knowledge; the ANN was used to estimate the function of interest. The results obtained after processing the real data are considered satisfactory and consistent with the goals of this dissertation, since, for reasons intrinsic to the problem studied, a relative error below 3% was targeted. It is concluded that the presented methodology is feasible and applicable in industrial settings, and could form part of a more complete tool such as a Profinet network analyzer.
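As a drastically simplified, hypothetical stand-in for the ANN-based estimator (a single linear model fitted by gradient descent on invented data, rather than a neural network trained on measured Profinet traffic), the sketch below illustrates the "fit a model to examples, then check the relative-error target" workflow:

```python
# Hypothetical, heavily simplified stand-in for the dissertation's ANN:
# fit y ≈ w*x + b by gradient descent on invented toy samples, then check
# the 3% relative-error target mentioned above.  Neither the data nor the
# model come from the dissertation.

def fit_linear(xs, ys, lr=0.01, steps=5000):
    """Least-squares fit of a line by plain gradient descent."""
    w = b = 0.0
    n = len(xs)
    for _ in range(steps):
        dw = sum((w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum((w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# toy "network size vs. start-up traffic volume" samples following y = 3x + 2
xs = [1, 2, 3, 4, 5]
ys = [5, 8, 11, 14, 17]
w, b = fit_linear(xs, ys)
rel_err = max(abs((w * x + b) - y) / y for x, y in zip(xs, ys))
print(rel_err < 0.03)  # True: meets a 3% relative-error target on this toy data
```

The real methodology replaces the line with an ANN and the toy samples with traffic captured from laboratory Profinet networks.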
59

Delay-Throughput Analysis in Distributed Wireless Networks

Abouei, Jamshid January 2009 (has links)
A primary challenge in wireless networks is to use available resources efficiently so that the Quality of Service (QoS) is satisfied while maximizing the throughput of the network. Among different resource allocation strategies, power and spectrum allocations have long been regarded as efficient tools to mitigate interference and improve the throughput of the network. Also, achieving a low transmission delay is an important QoS requirement in buffer-limited networks, particularly for users with real-time services. For these networks, too much delay results in dropping some packets. Therefore, the main challenge in networks with real-time services is to utilize an efficient power allocation scheme so that the delay is minimized while achieving a high throughput. This dissertation deals with these problems in distributed wireless networks.
60

High-throughput sequencing and small non-coding RNAs

Langenberger, David 29 April 2013 (has links) (PDF)
In this thesis the processing mechanisms of short non-coding RNAs (ncRNAs) are investigated using data generated by the current method of high-throughput sequencing (HTS). The recently adapted short RNA-seq protocol allows the sequencing of RNA fragments of microRNA-like length (∼18-28 nt). Thus, after mapping the data back to a reference genome, it is possible not only to measure but also to visualize the expression of all ncRNAs that are processed into fragments of this specific length. Short RNA-seq data was used to show that a highly abundant class of small RNAs, called microRNA-offset-RNAs (moRNAs), which was formerly detected in a basal chordate, is also produced from human microRNA precursors. To simplify the search, the blockbuster tool, which automatically recognizes blocks of reads in order to detect specific expression patterns, was developed. Using blockbuster, blocks from moRNAs were detected directly next to the miR or miR* blocks and could thus easily be registered in an automated way. Further investigation of the short RNA-seq data revealed that not only microRNAs but also almost all other classes of ncRNAs, such as tRNAs, snoRNAs, snRNAs, rRNAs, Y-RNAs, and vault RNAs, give rise to short ∼22 nt RNA fragments. The read patterns that arise after mapping these RNAs back to a reference genome seem to reflect the processing of each class and are thus specific to the RNA transcripts from which they derive. The potential of these patterns for the classification and identification of non-coding RNAs was explored. Using a random forest classifier trained on a set of characteristic features of the individual ncRNA classes, it was possible to distinguish three types of ncRNAs, namely microRNAs, tRNAs, and snoRNAs. To make the classification available to the research community, the free web service ‘DARIO’, which allows researchers to study short-read data from small RNA-seq experiments, was developed. 
The classification showed that read patterns are specific to different classes of ncRNAs. To make use of this feature, the tool deepBlockAlign was developed; it introduces a two-step approach to align read patterns, with the aim of quickly identifying RNAs that share similar processing footprints. In order to find possible exceptions to the well-known microRNA maturation by Dicer, and to identify additional substrates for Dicer processing, the small RNA sequencing data of a Dicer knockdown experiment in MCF-7 cells was re-evaluated. Several Dicer-independent microRNAs were found, among them the important tumor suppressor mir-663a. Many aspects of RNA maturation are known to leave traces in RNA sequencing data in the form of mismatches against the reference genome. It is possible to recover many well-known modified sites in tRNAs, providing evidence that modified nucleotides are a pervasive phenomenon in these data sets.
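The block-building step that blockbuster automates, grouping overlapping mapped reads into blocks whose arrangement forms a processing pattern, can be sketched as follows. The merging rule and coordinates are simplified toy assumptions, not blockbuster's actual implementation:

```python
# Simplified sketch of grouping mapped reads into "blocks": overlapping
# (start, end) read intervals on one strand are merged, and reads per
# block are counted.  Real tools also weigh read heights and strand; the
# coordinates below are invented toy data.

def build_blocks(reads):
    """Merge overlapping (start, end) intervals into (start, end, count) blocks."""
    blocks = []
    for start, end in sorted(reads):
        if blocks and start <= blocks[-1][1]:      # read overlaps the open block
            s, e, n = blocks[-1]
            blocks[-1] = (s, max(e, end), n + 1)
        else:
            blocks.append((start, end, 1))         # open a new block
    return blocks

# toy reads resembling a miR / miR* pair from a single precursor
reads = [(100, 122), (101, 123), (100, 121), (160, 182), (161, 183)]
print(build_blocks(reads))  # [(100, 123, 3), (160, 183, 2)]: two distinct blocks
```

Two well-separated blocks of similar depth are exactly the kind of pattern that distinguishes, say, a microRNA precursor from a tRNA in this approach.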
