  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
581

Termodiagnostika - dotykové a bezdotykové měření teploty / Thermodiagnostics – contact and contact-free temperature measurements

Mikula, Martin January 2014 (has links)
This thesis deals with thermodiagnostics in industrial practice, which today is very important for assessing the technical condition of equipment on the basis of temperature. It summarizes contact and contact-free measurement methods, their principles, and their advantages and disadvantages for application in industrial practice. As part of the thesis, measurements were carried out at the company Daikin Device Czech Republic, using a contact thermometer and two available thermal cameras, to solve current production-related tasks.
582

Antibacterial activity of the crude extract and fractions of Spirostachys africana against multi-drug resistant bacteria

Ajmal, Antoinette Alliya 05 1900 (has links)
MSc (Microbiology) / Department of Microbiology / Background: The persistently high incidence of infectious diseases over the last decade, specifically those caused by multi-drug resistant (MDR) bacteria, has made it necessary to investigate a variety of antimicrobial drug sources, such as plants. Medicinal plants have recently played a significant role in drug discovery for western pharmaceuticals, and have also been used successfully by traditional healers and herbalists to treat various infectious diseases for centuries. Currently, only a few medicinal plants are commercialized, largely because the phytochemicals of most medicinal plants have not yet been studied, although the plants have long been used by traditional healers. Given the constant development of multi-drug resistance in bacteria, S. africana extracts may provide an opportunity to find new antibacterial compounds that can serve as the foundation for formulating new antimicrobial drugs. Objectives: The aim of this study was to screen the antibacterial activity of the crude extract and fractions of S. africana against multi-drug resistant bacteria, and to evaluate other biological properties. Methods: Preliminary screening of the phytochemical constituents of S. africana and its fractions was done using standard qualitative and quantitative methods. Antibacterial activity of the extracts was evaluated using the agar well diffusion method and the microdilution assay against MDR bacterial strains. Antioxidant activities of the MCE and its fractions were measured by DPPH and reducing-power assays, and the toxicity of the MCE and its fractions was tested on Vero cells using a cell-based high-content screening assay. Results: Phytochemical analysis of the MCE and fractions showed the presence of phenolics, flavonoids, alkaloids, steroids, saponins, cardiac glycosides and terpenoids in most of the S. africana test samples. Fractions F1 and F2 both lacked alkaloids and saponins.
The micro-plate dilution assay demonstrated that the MCE and all its fractions can inhibit the growth of all the selected MDR bacterial strains at different concentrations (0.1 mg/ml to >12.5 mg/ml); the lowest average MICs were obtained from fractions F3 and F6, at 0.59 mg/ml and 0.71 mg/ml respectively. In contrast to the micro-plate dilution assay, the well diffusion assay demonstrated that the MCE and its fractions were not active against all the selected MDR bacterial strains, as no inhibition of the growth of K. pneumoniae was shown by any of the S. africana test samples. For the DPPH assay, the IC50 of the S. africana test samples ranged between 0.01 ± 0.34 mg/ml and 0.62 ± 0.05 mg/ml, while for the reducing-power assay the measured EC50 ranged between 0.61 ± 0.01 mg/ml and 11.30 ± 0.04 mg/ml. The MCE and fraction F2 exhibited the highest toxicity to Vero cells. Conclusion: The MCE and fractions of the plant S. africana have antibacterial activity against MDR bacterial strains and beneficial biological properties, and contain potential antibacterial compounds that may be valuable in the discovery of new drugs for the treatment of infectious diseases / NRF
583

Intégration de l'analyse de signaux biométriques dans un environnement de réalité virtuelle pour la détection par apprentissage automatique des facultés d'une personne / Integrating biometric-signal analysis into a virtual-reality environment for machine-learning detection of a person's faculties

Boisclair, Jonathan January 2019 (has links) (PDF)
No description available.
584

Exploring the mechanism of action of spore photoproduct lyase

Nelson, Renae 27 August 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Spore photoproduct lyase (SPL) is a radical SAM (S-adenosylmethionine) enzyme that is responsible for the repair of the DNA UV damage product 5-thyminyl-5,6-dihydrothymine (also called spore photoproduct, SP) in the early germination phase of bacterial endospores. SPL initiates the SP repair process using the 5'-dA• (5'-deoxyadenosyl radical) generated by SAM cleavage to abstract the H6proR atom, which results in a thymine allylic radical. These studies provide strong evidence that the TpT radical likely receives an H atom from an intrinsic H-atom donor, C141 in B. subtilis SPL. I have shown that C141 can be alkylated in native SPL by iodoacetamide treatment, indicating that it is accessible to the TpT radical. Activity studies demonstrate a 3-fold slower repair rate of SP by C141A, which produces TpTSO2− and TpT simultaneously, with no lag phase observed for TpTSO2− formation. Additionally, formation of both products shows a DVmax kinetic isotope effect (KIE) of 1.7 ± 0.2, which is smaller than the DVmax KIE of 2.8 ± 0.3 for the WT SPL reaction. Removal of the intrinsic H-atom donor by this single mutation disrupts the rate-limiting process in the enzyme catalysis. Moreover, C141A exhibits ~0.4 turnover, compared to the >5 turnovers in the WT SPL reaction. In studies of Y97 and Y99, structural and biochemical data suggest that these two tyrosine residues are also crucial in enzyme catalysis. It is suggested that Y99 in B. subtilis SPL uses a novel hydrogen-atom transfer pathway utilizing a pair of cysteine-tyrosine residues to regenerate SAM. The second tyrosine, Y97, structurally assists in SAM binding and may also contribute to SAM regeneration by interacting with radical intermediates to lower the energy barrier for the second H-abstraction step.
585

Modeling Eye Movement for the Assessment of Programming Proficiency

Al Madi, Naser S. 26 July 2020 (has links)
No description available.
586

An Exploration of the Word2vec Algorithm: Creating a Vector Representation of a Language Vocabulary that Encodes Meaning and Usage Patterns in the Vector Space Structure

Le, Thu Anh 05 1900 (has links)
This thesis is an exploration and exposition of a highly efficient shallow neural network algorithm called word2vec, which was developed by T. Mikolov et al. in order to create vector representations of a language vocabulary such that information about the meaning and usage of the vocabulary words is encoded in the vector space structure. Chapter 1 introduces natural language processing, vector representations of language vocabularies, and the word2vec algorithm. Chapter 2 reviews the basic mathematical theory of deterministic convex optimization. Chapter 3 provides background on some concepts from computer science that are used in the word2vec algorithm: Huffman trees, neural networks, and binary cross-entropy. Chapter 4 provides a detailed discussion of the word2vec algorithm itself and includes a discussion of continuous bag of words, skip-gram, hierarchical softmax, and negative sampling. Finally, Chapter 5 explores some applications of vector representations: word categorization, analogy completion, and language translation assistance.
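The skip-gram architecture mentioned in this abstract trains on (center word, context word) pairs drawn from a sliding window over the corpus. A minimal sketch of that pair-generation step is shown below; the toy corpus and window size are illustrative assumptions, not values from the thesis.

```python
# Sketch of skip-gram training-pair generation (one of the two word2vec
# architectures discussed in Chapter 4). Corpus and window are toy values.

def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs for every position in the token list."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # the center word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

corpus = "the quick brown fox jumps".split()
pairs = skipgram_pairs(corpus, window=1)
print(pairs[:3])  # → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown')]
```

In the full algorithm, each pair feeds one gradient step of the shallow network, with hierarchical softmax or negative sampling replacing the full softmax over the vocabulary.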
587

Genetic regulation of virulence factors contributing to colonization and pathogenesis of Helicobacter pylori

Baker, Patrick Ericson 14 October 2003 (has links)
No description available.
588

Bean Soup Translation: Flexible, Linguistically-motivated Syntax for Machine Translation

Mehay, Dennis Nolan 30 August 2012 (has links)
No description available.
589

Sur des méthodes préservant les structures d'une classe de matrices structurées / On structure-preserving methods of a class of structured matrices

Ben Kahla, Haithem 14 December 2017 (has links)
Les méthodes d'algèbre linéaire classiques, pour le calcul de valeurs et vecteurs propres d'une matrice, ou des approximations de rangs inférieurs (low-rank approximations) d'une solution, etc., ne tiennent pas compte des structures de matrices. Ces dernières sont généralement détruites durant le procédé du calcul. Des méthodes alternatives préservant ces structures font l'objet d'un intérêt important de la part de la communauté. Cette thèse constitue une contribution dans ce domaine. La décomposition SR peut être calculée via l'algorithme de Gram-Schmidt symplectique. Comme dans le cas classique, une perte d'orthogonalité peut se produire. Pour y remédier, nous avons proposé deux algorithmes, RSGSi et RMSGSi, qui consistent à ré-orthogonaliser deux fois les vecteurs à calculer. La perte de la J-orthogonalité s'est améliorée de manière très significative. L'étude directe de la propagation des erreurs d'arrondis dans les algorithmes de Gram-Schmidt symplectiques est très difficile à effectuer. Nous avons réussi à contourner cette difficulté et à donner des majorations pour la perte de la J-orthogonalité et de l'erreur de factorisation. Une autre façon de calculer la décomposition SR est basée sur les transformations de Householder symplectiques. Un choix optimal a abouti à l'algorithme SROSH. Cependant, ce dernier peut être sujet à une instabilité numérique. Nous avons proposé une nouvelle version modifiée, SRMSH, qui a l'avantage d'être aussi stable que possible. Une étude approfondie a été faite, présentant les différentes versions : SRMSH et SRMSH2. Dans le but de construire un algorithme SR d'une complexité d'ordre O(n³), où 2n est la taille de la matrice, une réduction (appropriée) de la matrice à une forme condensée (forme J-Hessenberg) via des similarités adéquates est cruciale. Cette réduction peut être effectuée via l'algorithme JHESS.
Nous avons montré qu'il est possible de réduire une matrice sous la forme J-Hessenberg en se basant exclusivement sur les transformations de Householder symplectiques. Le nouvel algorithme, appelé JHSH, est basé sur une adaptation de l'algorithme SRSH. Nous avons réussi à proposer deux nouvelles variantes, aussi stables que possible : JHMSH et JHMSH2. Nous avons constaté que ces algorithmes se comportent d'une manière similaire à l'algorithme JHESS. Une caractéristique importante de tous ces algorithmes est qu'ils peuvent rencontrer un breakdown fatal ou un « near breakdown » rendant impossible la suite des calculs, ou débouchant sur une instabilité numérique, privant le résultat final de toute signification. Ce phénomène n'a pas d'équivalent dans le cas euclidien. Nous avons réussi à élaborer une stratégie très efficace pour « guérir » le breakdown fatal et traiter le near breakdown. Les nouveaux algorithmes intégrant cette stratégie sont désignés par MJHESS, MJHSH, JHM²SH et JHM²SH2. Ces stratégies ont été ensuite intégrées dans la version implicite de l'algorithme SR, lui permettant de surmonter les difficultés rencontrées lors du fatal breakdown ou du near breakdown. Rappelons que, sans ces stratégies, l'algorithme SR s'arrête. Finalement, et dans un autre cadre de matrices structurées, nous avons présenté un algorithme robuste, via FFT et la matrice de Hankel, basé sur le calcul approché du plus grand commun diviseur (PGCD) de deux polynômes, pour résoudre le problème de la déconvolution d'images. Plus précisément, nous avons conçu un algorithme pour le calcul du PGCD de deux polynômes bivariés. La nouvelle approche est basée sur un algorithme rapide, de complexité quadratique O(n²), pour le calcul du PGCD des polynômes unidimensionnels. La complexité de notre algorithme est O(n²log(n)), où la taille des images floues est n x n. Les résultats expérimentaux avec des images synthétiquement floues illustrent l'efficacité de notre approche.
/ The classical linear algebra methods for calculating the eigenvalues and eigenvectors of a matrix, or low-rank approximations of a solution, etc., do not take the structures of matrices into account. Such structures are usually destroyed in the numerical process. Alternative structure-preserving methods are the subject of significant interest in the community. This thesis is a contribution to this field. The SR decomposition is usually implemented via the symplectic Gram-Schmidt algorithm. As in the classical case, a loss of orthogonality can occur. To remedy this, we have proposed two algorithms, RSGSi and RMSGSi, in which the reorthogonalization of the current set of vectors against the previously computed set is performed twice. The loss of J-orthogonality is significantly improved. A direct rounding-error analysis of the symplectic Gram-Schmidt algorithm is very hard to accomplish. We managed to get around this difficulty and give error bounds on the loss of J-orthogonality and on the factorization. Another way to implement the SR decomposition is based on symplectic Householder transformations. An optimal choice of free parameters yielded an optimal version of the algorithm, SROSH. However, the latter may be subject to numerical instability. We have proposed a new modified version, SRMSH, which has the advantage of being numerically more stable. A detailed study led to two new, numerically more stable variants: SRMSH and SRMSH2. In order to build an SR algorithm of complexity O(n³), where 2n is the size of the matrix, a reduction to a condensed matrix form (upper J-Hessenberg form) via adequate similarities is crucial. This reduction may be handled via the JHESS algorithm.
We have shown that it is possible to reduce a general matrix to upper J-Hessenberg form using only symplectic Householder transformations. The new algorithm, called JHSH, is based on an adaptation of the SRSH algorithm. We are led to two new variant algorithms, JHMSH and JHMSH2, which are significantly more stable numerically. We found that these algorithms behave quite similarly to the JHESS algorithm. The main drawback of all these algorithms (JHESS, JHMSH, JHMSH2) is that they may encounter fatal breakdowns, or may suffer from a severe form of near-breakdown, causing a brutal stop of the computations or leading to serious numerical instability that deprives the final result of any meaning. This phenomenon has no equivalent in the Euclidean case. We sketch out a very efficient strategy for curing fatal breakdowns and treating near-breakdowns. The new algorithms incorporating this strategy are referred to as MJHESS, MJHSH, JHM²SH and JHM²SH2. These strategies were then incorporated into the implicit version of the SR algorithm, allowing it to overcome the difficulties encountered at a fatal breakdown or near-breakdown. We recall that, without these strategies, the SR algorithm breaks down. Finally, and in another framework of structured matrices, we presented a robust algorithm, via FFT and a Hankel matrix, based on computing approximate greatest common divisors (GCD) of polynomials, for solving the problem of blind image deconvolution. Specifically, we designed a specialized algorithm for computing the GCD of bivariate polynomials. The new algorithm is based on the fast GCD algorithm for univariate polynomials, of quadratic complexity O(n²) flops. The complexity of our algorithm is O(n²log(n)), where the size of the blurred images is n x n. Experimental results with synthetically blurred images are included to illustrate the effectiveness of our approach.
590

Towards expressive melodic accompaniment using parametric modeling of continuous musical elements in a multi-attribute prediction suffix trie framework

Mallikarjuna, Trishul 22 November 2010 (has links)
Elements of continuous variation such as tremolo, vibrato and portamento open up dimensions of their own in expressive melodic music, in styles such as Indian Classical Music. There is published work on parametrically modeling some of these elements individually and applying the modeled parameters to automatically generated musical notes in the context of machine musicianship, using simple rule-based mappings. There have also been many systems developed for generative musical accompaniment using probabilistic models of discrete musical elements such as MIDI notes and durations, many of them inspired by computational research in linguistics. However, there does not appear to have been a combined approach that parametrically models expressive elements within a probabilistic framework. This document presents a real-time computational framework that uses a multi-attribute trie / n-gram structure to model parameters such as the frequency, depth and/or lag of expressive variations such as vibrato and portamento, along with conventionally modeled elements such as musical notes, their durations and their metric positions in melodic audio input. This work proposes storing the parameters of expressive elements as metadata in the individual nodes of the traditional trie structure, along with the distribution of their probabilities of occurrence. During automatic generation of music, the expressive parameters learned in the training phase are applied to the associated re-synthesized musical notes. The model is aimed at providing automatic melodic accompaniment in a performance scenario. Parametric modeling of the continuous expressive elements in this form is hypothesized to capture deeper temporal relationships among musical elements, and is thereby expected to bring about a more expressive and more musical outcome in such a performance than has been possible in other work on machine musicianship using only static mappings or randomized choice.
A system was developed on the Max/MSP software platform with this framework, which takes a pitched audio input such as a human singing voice and produces a pitch track that may be applied to a synthesized sound of continuous timbre. The system was trained and tested with several vocal recordings of North Indian Classical Music, and a subjective evaluation of the resulting audio was made using an anonymous online survey. The results of the survey show the output tracks generated by the system to be as musical and expressive, if not more so, than the case where the pitch track generated from the original audio was directly rendered as output, and also show the output with expressive elements to be perceivably more expressive than the version of the output without expressive parameters. The results further suggest that more experimentation may be required to establish the efficacy of the framework relative to using randomly selected parameter values for the expressive elements. This thesis presents the scope, context, implementation details and results of the work, and suggests future improvements.
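The idea of a prediction suffix trie whose nodes carry both continuation counts and expressive-parameter metadata can be sketched as follows. All class and field names here are hypothetical illustrations of the described data structure, not code from the thesis (whose system was built in Max/MSP).

```python
# Hypothetical sketch of a multi-attribute prediction suffix trie:
# nodes count note continuations and hold expressive metadata samples
# (e.g. vibrato rate) alongside the counts used for prediction.

class TrieNode:
    def __init__(self):
        self.children = {}    # next note -> TrieNode
        self.count = 0        # occurrences of this context
        self.expressive = []  # metadata samples, e.g. {"vibrato_hz": 5.5}

class PredictionSuffixTrie:
    def __init__(self, order=3):
        self.root = TrieNode()
        self.order = order    # maximum n-gram length stored

    def add(self, notes, expressive=None):
        """Insert every length-<=order suffix context of the note sequence."""
        for start in range(len(notes)):
            node = self.root
            for note in notes[start:start + self.order]:
                node = node.children.setdefault(note, TrieNode())
                node.count += 1
                if expressive:
                    node.expressive.append(expressive)

    def predict(self, context):
        """Most frequent continuation of the (order-1)-note context."""
        node = self.root
        for note in context[-(self.order - 1):]:
            if note not in node.children:
                return None   # unseen context
            node = node.children[note]
        if not node.children:
            return None
        return max(node.children, key=lambda n: node.children[n].count)

trie = PredictionSuffixTrie(order=3)
trie.add(["C4", "D4", "E4", "C4", "D4", "E4"], {"vibrato_hz": 5.5})
print(trie.predict(["C4", "D4"]))  # prints E4
```

At generation time, the stored metadata of the predicted node would supply the expressive parameters (vibrato rate, portamento lag, and so on) applied to the re-synthesized note.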
