  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Development of fast magnetic resonance imaging methods for investigation of the brain

Grieve, Stuart Michael 2000 (has links)
No description available.
2

Estratégia de manutenção em uma oficina de cilindros de laminação de aços longos

Monteiro, Guilherme Arthur Brunet 29 July 2013 (has links)
The steel market as a whole undergoes constant change driven by new entrants and market swings, and in an extremely competitive context steel producers follow an arduous path in the relentless pursuit of globally competitive costs. This pursuit of cost reduction forces a review of standards and concepts about the business, giving rise to new ideas and new ways of doing what has long been done the same way. This dissertation presents the application of a methodology based on maintenance concepts to a Roll Shop in a long-steel rolling mill. Centred on assembly, disassembly, calibration and operational adjustment, a Roll Shop involves intensive recovery and replacement of items, driven largely by operational inspections. The proposed model addresses these inspections, making explicit the frequency, parameters and sequencing of the activity, and applies preventive maintenance to specific assemblies, all based on historical data obtained via snapshots. By analysing the behaviour of faults and breakdowns, it supports decisions on the type of intervention, classified by technology, methodology, and periodicity or frequency, the last two being derived with the delay time analysis concept.
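The delay-time idea the abstract relies on can be made concrete: a defect appears, then takes a random "delay time" before becoming a failure, and inspecting every T hours catches defects still inside that window. The sketch below uses a standard textbook delay-time-analysis model with exponential delay times; all parameter values (defect rate, mean delay, costs) are illustrative assumptions, not figures from the dissertation.

```python
import math

def fraction_becoming_failures(T, mean_delay):
    """Probability a defect turns into a failure before the next inspection,
    assuming defects arise uniformly over the interval and delay times are
    exponential with the given mean (a standard delay-time-analysis result)."""
    mu = 1.0 / mean_delay
    return 1.0 - (1.0 - math.exp(-mu * T)) / (mu * T)

def cost_rate(T, defect_rate, mean_delay, c_inspect, c_failure, c_repair):
    """Expected cost per hour: one inspection per interval, plus each defect
    either failing in service (c_failure) or being repaired at inspection."""
    b = fraction_becoming_failures(T, mean_delay)
    defects_per_interval = defect_rate * T
    return (c_inspect
            + defects_per_interval * (b * c_failure + (1 - b) * c_repair)) / T

# Pick the inspection interval with the lowest expected cost rate.
candidates = range(8, 169, 8)  # every 8 h up to one week (hypothetical grid)
best_T = min(candidates,
             key=lambda T: cost_rate(T, 0.01, 48.0, 50.0, 2000.0, 200.0))
```

Longer intervals save inspection cost but let more defects mature into failures; the optimum balances the two, which is exactly the periodicity/frequency decision the model in the dissertation supports.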
3

Serializable Isolation for Snapshot Databases

Cahill, Michael James 2009 (has links)
PhD Many popular database management systems implement a multiversion concurrency control algorithm called snapshot isolation rather than providing full serializability based on locking. There are well-known anomalies permitted by snapshot isolation that can lead to violations of data consistency by interleaving transactions that would maintain consistency if run serially. Until now, the only way to prevent these anomalies was to modify the applications by introducing explicit locking or artificial update conflicts, following careful analysis of conflicts between all pairs of transactions. This thesis describes a modification to the concurrency control algorithm of a database management system that automatically detects and prevents snapshot isolation anomalies at runtime for arbitrary applications, thus providing serializable isolation. The new algorithm preserves the properties that make snapshot isolation attractive, including that readers do not block writers and vice versa. An implementation of the algorithm in a relational database management system is described, along with a benchmark and performance study, showing that the throughput approaches that of snapshot isolation in most cases.
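The anomalies the abstract refers to include the classic "write skew": two transactions each preserve a constraint against their own snapshot, write disjoint items (so SI sees no conflict), and jointly violate the constraint. The following toy simulation of the textbook on-call-doctors example illustrates the effect; it models SI snapshots with plain dictionaries and is not drawn from any particular DBMS.

```python
# Minimal simulation of the "write skew" anomaly permitted by snapshot
# isolation. Constraint: at least one doctor must remain on call.

def run_under_snapshot_isolation():
    committed = {"alice_on_call": True, "bob_on_call": True}

    # Both transactions take their snapshot before either commits.
    snap_t1 = dict(committed)
    snap_t2 = dict(committed)

    # T1: Alice goes off call if, per its snapshot, Bob is still on call.
    if snap_t1["bob_on_call"]:
        committed["alice_on_call"] = False  # disjoint write: no SI conflict

    # T2: Bob checks the same constraint against his own stale snapshot.
    if snap_t2["alice_on_call"]:
        committed["bob_on_call"] = False

    return committed

state = run_under_snapshot_isolation()
# Both flags end up False: each transaction preserved the invariant against
# its own snapshot, yet the combined result violates it. Run serially, the
# second transaction would have seen the first's update and refused.
print(state)
```

Cahill's algorithm detects the read-write dependency pattern behind such interleavings at runtime and aborts one participant, so the application needs no manual locking or artificial update conflicts.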
5

Multi-Master Replication for Snapshot Isolation Databases

Chairunnanda, Prima 2013 (has links)
Lazy replication with snapshot isolation (SI) has emerged as a popular choice for distributed databases. However, lazy replication requires the execution of update transactions at one (master) site so that it is relatively easy for a total SI order to be determined for consistent installation of updates in the lazily replicated system. We propose a set of techniques that support update transaction execution over multiple partitioned sites, thereby allowing the master to scale. Our techniques determine a total SI order for update transactions over multiple master sites without requiring global coordination in the distributed system, and ensure that updates are installed in this order at all sites to provide consistent and scalable replication with SI. We have built our techniques into PostgreSQL and demonstrate their effectiveness through experimental evaluation.
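One classical building block for a total SI order over multiple master sites without a global coordinator is a Lamport-style logical clock: each commit is tagged with a (clock, site id) pair, remote timestamps are merged into the local clock, and every replica installs updates in timestamp order. The sketch below shows only this generic ordering idea, not the specific protocol developed in the thesis.

```python
# Generic sketch: globally comparable commit timestamps from per-site
# logical clocks, so all replicas can install updates in one total order.

class Site:
    def __init__(self, site_id):
        self.site_id = site_id
        self.clock = 0

    def commit(self):
        self.clock += 1
        return (self.clock, self.site_id)  # ties broken by site id

    def observe(self, remote_ts):
        # Merge a remote timestamp so later local commits sort after it.
        self.clock = max(self.clock, remote_ts[0])

a, b = Site("A"), Site("B")
t1 = a.commit()      # (1, "A")
b.observe(t1)        # B learns of A's commit before its own
t2 = b.commit()      # (2, "B") -- guaranteed to sort after t1
ordered = sorted([t2, t1])
# Every replica sorting by (clock, site_id) agrees that t1 precedes t2.
```

The hard part the thesis addresses, which this sketch omits, is ensuring the chosen order is also a valid snapshot isolation order for partitioned update transactions.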
6

Molecular methods for genotyping selected detoxification and DNA repair enzymes / J. Labuschagne

Labuschagne, Jeanine 2010 (has links)
The emerging field of personalized medicine and the prediction of side effects caused by pharmaceutical drugs is being studied intensively in the post-genomic era. The molecular basis of inheritance and disease susceptibility is being unravelled, especially through the use of rapidly evolving new technologies, which in turn facilitate analyses of individual variation across the whole genome in both single subjects and large groups. Genetic variation is common, and although most variants have no apparent effect on the gene product, some do, such as an altered ability to detoxify xenobiotics. The human body has a highly effective detoxification system that detoxifies and excretes endogenous as well as exogenous toxins. Numerous studies have shown that specific genetic variants influence the efficacy of drug metabolism and consequently the dosage administered. The primary aim of this project was the local implementation and assessment of two genotyping approaches: the Applied Biosystems SNaPshot technique and the Affymetrix DMET microarray. A secondary aim was to investigate whether links could be found between the genetic data and the biochemical detoxification profiles of participants. I investigated both approaches and gained insight into which method would be better for specific local applications, taking into consideration robustness, ease of implementation, and cost effectiveness in terms of data generated. The final study cohort comprised 18 participants whose detoxification profiles were known. Genotyping was performed using the DMET microarray and SNaPshot techniques. The SNaPshot technique, performed locally, was used to genotype 11 SNPs related to DNA repair and detoxification. Each DMET microarray delivers significantly more data, genotyping 1931 genetic markers related to drug metabolism and transport.
In the absence of a local service supplier, the DMET microarrays were outsourced to DNALink in South Korea; DNALink generated raw data which was analysed locally. I experienced many problems with the implementation of the SNaPshot technique, and although numerous avenues of troubleshooting were explored with varying degrees of success, I concluded that SNaPshot technology is not the best-suited approach for genotyping. Data obtained from the DMET microarray was fed into the DMET Console software to obtain genotypes and subsequently analysed with the help of the NWU statistical consultation services. Two approaches were followed: first, clustering the data; second, a targeted gene approach. Neither method was able to establish a relationship between the DMET genotyping data and the detoxification profiles. For future studies to successfully correlate SNPs or SNP groups with a specific detoxification profile, two key issues should be addressed: i) the procedure for determining the detoxification profile following substrate loading should be further refined by more frequent sampling after loading; ii) the number of participants should be increased to provide statistical power that enables a true representation of the particular genetic markers in the specific population. Statistical analyses such as latent class analysis for clustering participants will also be much more useful for data analysis and interpretation if the study is not underpowered. Thesis (M.Sc. (Biochemistry))--North-West University, Potchefstroom Campus, 2011.
8

Estudo da estrutura e filogenia da população do Rio de Janeiro através de SNPs do cromossomo Y

Santos, Simone Teixeira Bonecker dos 2010 (has links)
Fundação Oswaldo Cruz. Instituto Oswaldo Cruz. Rio de Janeiro, RJ, Brasil. The Brazilian population is highly admixed, the product of a relatively recent and recurrent process. Millions of indigenous people lived here when colonization began, involving mainly Portuguese men. Since the immigration of European women during the first centuries was insignificant, unions between European men and indigenous women became common, beginning the ethnic heterogeneity found in the population today. Subsequently, with the arrival of enslaved Africans during the sugarcane economic cycle, relationships between Europeans and Africans began. The result is essentially a tri-hybrid population that now also carries contributions from other groups, among them Italians, Spaniards, Syrians, Lebanese and Japanese. To better understand Brazilian phylogenetic roots, this study used biallelic markers from the non-recombining region of the Y chromosome. The goal was to analyse how these heterogeneous groups contributed to the paternally inherited gene pool of the male population of Rio de Janeiro, and thus deepen knowledge of the migratory movements that structured this population.
Using multiplex minisequencing, 13 single nucleotide polymorphisms (SNPs) were analysed, allowing the identification of nine haplogroups and four sub-haplogroups in a sample of 200 unrelated individuals resident in the State of Rio de Janeiro, chosen randomly among participants in paternity studies at the Public Defender's Office of Rio de Janeiro. Of the haplogroups analysed, only R1a was not observed in this population. The most frequent haplogroup was R1b1, of European origin, at 51%; the least frequent, at 1%, was Q1a3a, found among Native Americans. About 85% of the Y chromosomes analysed are of European origin, 10.5% African and 1% Amerindian, with the remainder of undefined origin. Compared with published data, the sample resembled the white population of Porto Alegre, and no significant difference was found between the male gene pool of Rio de Janeiro and that of Portugal by an exact test of population differentiation.
These results corroborate historical accounts of the founding of the population of Rio de Janeiro in the 16th century, a period marked by a significant reduction of the Amerindian population and an important demographic contribution from Sub-Saharan Africa and Europe, particularly Portugal. Given the high degree of admixture of the Brazilian population and the advances of personalized medicine, studies of human genetic structure have fundamental implications for understanding evolution and its impact on human disease, since skin colour is an unreliable predictor of an individual's ethnic ancestry.
9

Ensuring Serializable Executions with Snapshot Isolation DBMS

Alomari, Mohammad 2009 (has links)
Doctor of Philosophy (PhD) Snapshot Isolation (SI) is a multiversion concurrency control mechanism that has been implemented by open-source and commercial database systems such as PostgreSQL and Oracle. The main feature of SI is that a read operation does not block a write operation and vice versa, which allows a higher degree of concurrency than traditional two-phase locking. SI prevents many anomalies that appear under other isolation levels, but it can still produce non-serializable executions in which database integrity constraints are violated. Several techniques have been proposed to ensure serializable execution with engines running SI; these techniques are based on modifying the applications by introducing conflicting SQL statements. However, with each of these techniques the DBA has to make a difficult choice among the possible transactions to modify. This thesis helps DBAs choose between these techniques by showing how the choices affect system performance. It also proposes a novel technique called External Lock Manager (ELM), which introduces conflicts in a separate lock-manager object so that every execution will be serializable. We build a prototype system for ELM and run experiments to demonstrate the robustness of the new technique compared to the previous techniques. The experiments show that modifying the application code for some transactions has a high performance impact for some choices, which makes it very hard for DBAs to choose wisely. However, ELM has peak performance similar to SI no matter which transactions are chosen for modification; ELM is therefore a robust technique for ensuring serializable execution.
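The core of the ELM idea, as the abstract describes it, is to hold the conflict-inducing locks outside the database engine: transactions identified as conflict-prone first acquire a common lock keyed by the data they touch, so their executions are forced back-to-back even though the engine itself only provides SI. The sketch below illustrates that shape with Python threads standing in for concurrent transactions; the class name and key are hypothetical, and the real ELM prototype is a separate system component, not an in-process lock table.

```python
import threading

class ExternalLockManager:
    """Toy stand-in for an external lock manager: a lock table held
    outside the (simulated) DBMS, handing out one lock per conflict key."""
    def __init__(self):
        self._locks = {}
        self._guard = threading.Lock()

    def lock_for(self, key):
        with self._guard:
            return self._locks.setdefault(key, threading.Lock())

elm = ExternalLockManager()
log = []

def risky_transaction(name, key):
    # Acquire the ELM lock first; the engine-level work happens inside,
    # so two holders of the same key can never interleave.
    with elm.lock_for(key):
        log.append((name, "begin"))
        # ... execute SQL against the SI engine here ...
        log.append((name, "commit"))

t1 = threading.Thread(target=risky_transaction, args=("T1", "doctors_on_call"))
t2 = threading.Thread(target=risky_transaction, args=("T2", "doctors_on_call"))
t1.start(); t2.start(); t1.join(); t2.join()
# The log always shows one transaction's begin/commit pair completing
# before the other's begins: the executions were serialized.
```

Because the lock covers only transactions flagged as dangerous, unrelated workload keeps SI's readers-don't-block-writers behaviour, which is why ELM's peak throughput stays close to plain SI.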
