
Determination of bacterial endotoxin (pyrogen) in radiopharmaceuticals by the gel clot method. Validation.

Neuza Taeko Okasaki Fukumori 28 February 2008 (has links)
Before the Limulus amebocyte lysate (LAL) test, the only available means of pyrogenicity testing for parenteral drugs and medical devices was the United States Pharmacopeia (USP) rabbit pyrogen test. For radiopharmaceuticals in particular, the LAL assay is the method of choice for determining bacterial endotoxin (pyrogen). The aim of this work was to validate the gel clot method for some radiopharmaceuticals without measurable interference. The FDA's LAL Test guideline defines interference as a condition that causes a significant difference between the endpoints of a positive water control and a positive product control series using a standard endotoxin. Experiments were performed in accordance with the USP bacterial endotoxins test on 131I-m-iodobenzylguanidine; the radioisotopes Gallium-67 and Thallium-201; and the lyophilized reagents DTPA, Phytate, GHA, HSA and Colloidal Tin. The Maximum Valid Dilution (MVD) was calculated for each product based upon its clinical dose, and a twofold serial dilution below the MVD was performed in duplicate to detect interference. The labeled sensitivity of the LAL reagent used was 0.125 EU mL-1 (Endotoxin Units per milliliter). For validation, a twofold dilution series of control standard endotoxin (CSE), from 0.5 to 0.03 EU mL-1, was prepared in sterile, non-pyrogenic water and tested in quadruplicate to confirm the labeled sensitivity of the LAL reagent. The same dilution series was performed with the CSE and the product at a 1:100 dilution factor, in three consecutive batches of each radiopharmaceutical. The products 131I-m-iodobenzylguanidine, Gallium-67, Thallium-201, DTPA, HSA and Colloidal Tin were found compatible with the LAL test at a 1:100 dilution factor. Phytate and GHA showed interference in the gel clot test. Other techniques for determining endotoxins, such as the chromogenic (color development) and turbidimetric (turbidity development) assays, were also assessed to obtain qualitative and quantitative information about the endotoxin concentration in samples.
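The Maximum Valid Dilution step described above follows the standard USP relation MVD = (endotoxin limit × product concentration) / λ, where λ is the labeled lysate sensitivity. A minimal sketch (the function name and the numeric limit are illustrative, not taken from the thesis):

```python
def max_valid_dilution(endotoxin_limit, concentration, lysate_sensitivity):
    """MVD = (endotoxin limit * product concentration) / lysate sensitivity (lambda).

    endotoxin_limit and lysate_sensitivity are in EU/mL; concentration is the
    product concentration in the units the limit refers to (use 1.0 for a
    limit already expressed per mL of product).
    """
    return (endotoxin_limit * concentration) / lysate_sensitivity

# Illustrative numbers only: a hypothetical limit of 17.5 EU/mL with the
# 0.125 EU/mL lysate sensitivity used in this work gives an MVD of 140.
print(max_valid_dilution(17.5, 1.0, 0.125))  # 140.0
```

Any dilution at or below the MVD (such as the 1:100 factor used in this work) remains a valid test condition.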

Pretesting the revised version of the South African Substance Use Contextual Risk Instrument (SASUCRI)

Hendricks, Shadley January 2018 (has links)
Magister Psychologiae - MPsych / Substance use is a major problem in South Africa, particularly within the Western Cape. The problem is prominent amongst adolescents in low socio-economic-status communities, and prevalence rates are increasing. Literature regarding the onset of substance use is often limited and inadequate. It is for this reason that the South African Substance Use Contextual Risk Instrument (SASUCRI) was developed and employed to assess factors which contribute to adolescent substance use. The SASUCRI is a measure of the individual and contextual factors associated with adolescent substance use. It was developed for use in low socio-economic-status communities to identify adolescents at risk for substance use, as well as communities in which these risk factors are present. The initial validation study reported on the validity evidence for this instrument, identified items to be rewritten to improve the instrument's validity, and recommended the inclusion of additional items to improve reliability in some sub-scales. This motivated the current study, whose purpose was to pretest the new and revised items. The researcher pretested both the English and the Afrikaans revised sub-scales of the SASUCRI. The following sub-scales were pretested: "School as support" (6 items), "School as a stressor" (6 items), "Tolerance for soft drugs" (6 items), "Hopelessness individual" (11 items) and "Hopelessness community" (5 items). The theoretical framework employed was the Multi-Component Approach, which guided the data collection, the analysis and, in part, the discussion of the findings. The study was qualitative in nature. Two schools were selected from low socio-economic-status communities; 32 high school learners participated, across four focus groups.

Development and validation of a bioassay for the determination of ceftaroline in powder for injectable solution: preliminary stability study

Mascarello Junior, Idamir José January 2017 (has links)
In this work, analytical and microbiological methods were developed and validated, together with a preliminary study of the stability, degradation kinetics and cytotoxicity of Ceftaroline Fosamil powder for injectable solution, a fifth-generation cephalosporin antibiotic indicated for community-acquired pneumonia and severe infections of the skin and soft tissues. The validation of the microbiological assay by the agar diffusion cylinder-plate method, in a 3x3 design, showed satisfactory results for specificity, linearity in the range of 2.0 - 8.0 μg/mL, precision (109.42 %), accuracy (102.3 %) and robustness. Solutions of Ceftaroline Fosamil from the finished product exposed to UVC radiation (254 nm) and to thermal degradation at 60 °C were used to assess the specificity of the bioassay, and robustness was evaluated by varying the concentration of the inoculated medium (0.8 and 1.2 %). The HPLC method was developed and validated with respect to specificity, linearity, precision, accuracy and robustness, using a SHIMADZU high performance liquid chromatograph with an Agilent® C18 column and a mobile phase of water with 1.0 % triethylamine, pH 5.0, and acetonitrile (87:13 v/v). The method was specific and linear in the range of 5.0 - 60.0 μg/mL, precise (110.0 %), accurate (100.68 %) and robust. The validated microbiological and chromatographic methods were compared statistically by Student's t-test and no significant difference was found between them. In the preliminary stability study, the product was found stable under acid hydrolysis (0.1 M) and UVA light over the period evaluated, and unstable under thermal degradation (40 and 60 °C), oxidative degradation with hydrogen peroxide, basic degradation in NaOH (0.1 M and 0.01 M) and UVC light. Samples exposed to UVC light and to thermal degradation at 60 °C showed zero-order and second-order degradation kinetics, respectively. The cytotoxicity assay showed no difference between the normal condition and the sample submitted to forced degradation, suggesting that any degradation products formed did not alter the result.
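The statistical comparison of the two validated methods uses Student's t-test; the pooled-variance two-sample form can be sketched as below (the recovery values are hypothetical placeholders, not data from the study):

```python
from statistics import mean, variance

def two_sample_t(a, b):
    """Pooled-variance (equal-variance) two-sample Student's t statistic."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5

# Hypothetical potency recoveries (%) from the microbiological and HPLC methods:
micro = [109.1, 110.2, 108.9]
hplc = [109.8, 110.3, 109.5]
t = two_sample_t(micro, hplc)  # compare |t| against the critical t value
```

If |t| is below the critical value for the chosen significance level and n_a + n_b − 2 degrees of freedom, the two methods are considered statistically equivalent.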

Validation of Case-finding Algorithms Derived from Health Administrative Data for Identifying Neonatal Bacterial Sepsis

Yao, Chunhe 01 October 2019 (has links)
Objectives: The objectives of this thesis were to: 1) develop and validate a coding algorithm to identify true cases of neonatal bacterial sepsis, and 2) apply the algorithm to calculate incidence rates and estimate temporal trends of neonatal bacterial sepsis. Methods: For Objective 1, the reference cohorts were assembled among neonates born in 2012-2017 using patient-level health care encounter data. Any neonate who met both Diagnostic Criterion I (microbiological confirmation) and Criterion II (sepsis-related antibiotic administration) was included in the true-positive cohort. Potential coding algorithms were developed based on different combinations of ICD-10-CA codes on the hospitalization discharge abstract. For Objective 2, the coding algorithm with the most favourable characteristics was applied to provincial data to calculate incidence rates in Ontario during 2003-2017. Recent temporal trends were estimated by Poisson regression analysis. Results: In Objective 1, since all true-positive cases identified were born at preterm gestation, the study population in Objective 2 was limited to preterm infants. The final coding algorithm selected had a sensitivity of 75.3% (95% CI, 66.8%-83.7%), a specificity of 98.2% (95% CI, 97.8%-98.6%) and a PPV of 50.0% (95% CI, 42.1%-58.0%). Using this algorithm, the annual incidence declined over time from 50.2 (95% CI, 45.4-55.4) per 1000 preterm infants in 2003 to 27.5 (95% CI, 20.4-36.9) per 1000 preterm infants in 2017; the trend was statistically significant (P < 0.0001). Significant variation in bacterial sepsis incidence rates was noted across infant sex and gestational age. Conclusion: The coding algorithm developed in this study could not accurately identify neonates with bacterial sepsis from within health administrative databases using the data currently available to us. Objective 2 was carried out to demonstrate the application of the algorithm; however, the provincial rates should be interpreted cautiously given the poor performance of the case-finding algorithm.
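The validation statistics reported above come from a standard 2x2 confusion matrix. A minimal sketch (the cell counts below are illustrative, chosen only to mimic the reported pattern of high specificity with moderate sensitivity and PPV; they are not the study's actual counts):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # true cases the algorithm flags
        "specificity": tn / (tn + fp),  # non-cases the algorithm clears
        "ppv": tp / (tp + fp),          # flagged records that are true cases
    }

# Illustrative counts only:
m = diagnostic_metrics(tp=75, fp=75, fn=25, tn=4000)
```

Note how a rare outcome drives the gap between specificity and PPV: even with few false positives relative to the non-cases, flagged records are only half true cases.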

Requirements-Oriented Methodology for Evaluating Ontologies

Yu, Jonathan, Jonathan.Yu@csiro.au January 2009 (has links)
Ontologies play key roles in many applications today. Whether an ontology is newly specified or reused, it is therefore important to determine its suitability to the application at hand. This need is addressed by ontology evaluation, which determines qualities of an ontology using methodologies, criteria or measures. However, to address the ontology requirements of a given application, it is necessary to determine the appropriate set of criteria and measures. In this thesis, we propose a Requirements-Oriented Methodology for Evaluating Ontologies (ROMEO). ROMEO outlines a methodology for determining appropriate methods for ontology evaluation, incorporating a suite of existing ontology evaluation criteria and measures. It helps ontology engineers determine relevant ontology evaluation measures for a given set of ontology requirements by linking those requirements to existing measures through a set of questions. There are three main parts to ROMEO. First, ontology requirements are elicited from a given application and form the basis for an appropriate evaluation of ontologies. Second, appropriate questions are mapped to each ontology requirement. Third, relevant ontology evaluation measures are mapped to each of those questions. From the ontology requirements of an application, ROMEO thus determines appropriate evaluation methods by mapping applicable questions to the requirements and those questions to appropriate measures. In this thesis, we apply the ROMEO methodology to obtain appropriate ontology evaluation methods for ontology-driven applications through case studies of Lonely Planet and Wikipedia. Since the mappings determined by ROMEO depend on the analysis of the ontology engineer, validation of these mappings is needed.
As such, in addition to proposing the ROMEO methodology, this thesis proposes a method for the empirical validation of ROMEO mappings. We report on two empirical validation experiments, carried out in controlled environments, that examine the performance of ontologies over a set of tasks. The tasks vary and are used to compare the performance of a set of ontologies in the respective experimental environment; the ontologies used vary with respect to the specific ontology quality or measure being examined. Empirical validation experiments were conducted for two mappings between questions and their associated measures, drawn from the Lonely Planet and Wikipedia case studies. Since these mappings are application-independent, they may be reusable in subsequent applications of the ROMEO methodology. Using a ROMEO mapping from the Lonely Planet case study, we validate a mapping of a coverage question to the F-measure; the validation experiment for this mapping was inconclusive and requires further analysis. Using a ROMEO mapping from the Wikipedia case study, we carried out a separate validation experiment examining a mapping between an intersectedness question and the tangledness measure; the results showed this mapping to be valid. For future work, we propose additional validation experiments for the other mappings identified between questions and measures.
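One of the mappings examined links a coverage question to the F-measure, the weighted harmonic mean of precision and recall. A minimal sketch of the measure itself:

```python
def f_measure(precision, recall, beta=1.0):
    """F-measure: harmonic mean of precision and recall, weighted by beta
    (beta > 1 favours recall, beta < 1 favours precision; beta = 1 gives F1)."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

For example, an ontology whose coverage of a corpus yields precision 0.8 and recall 0.5 scores F1 ≈ 0.615; the harmonic mean penalizes the weaker of the two components.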

A pattern-based approach to the specification and validation of web services interactions

Li, Zheng, n/a January 2007 (has links)
Web services are designed for composition and use by third parties through dynamic discovery. As such, interoperability between services is of great importance to ensure that services can work together towards the overall application goals. In particular, the interaction protocols of a service need to be implemented and used properly so that the service composition can conduct itself in an orderly fashion. There have been significant research efforts in providing rich descriptions of Web services, including their behavioural properties. When describing the interaction process/protocols of a service, most adopt a procedural or programming-style approach. We argue that this style of description for service interactions is not natural for publishing service behaviour properties from the viewpoint of facilitating third-party service composition and analysis; especially when dealing with services with diverse behaviour, the limits of these procedural approaches become apparent. In this thesis, we introduce a lightweight, pattern/constraint-based declarative approach that better supports the specification and use of service interaction properties in the service description and composition process. This approach uses patterns to describe the interaction behaviour of a service as a set of constraints. As such, it supports the incremental description of a service's interaction behaviour from the service developer's perspective, and easy understanding and analysis of the interaction properties from the service user's perspective. It has been incorporated into OWL-S for service developers to describe service interaction constraints. We also present a framework, with tool support, for monitoring and checking the conformance of a service's runtime interactions against its specified interaction properties, to test whether the service is used properly and whether it fulfils its behavioural obligations. The tool involves interception of service interactions/messages, representation of interaction constraints using finite state automata and machines, and conformance checking of service interactions against those constraints. As such, we provide a useful tool for validating the implementation and use of services with regard to their interaction behaviour.
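Conformance checking of an intercepted message trace against an interaction constraint reduces to replaying the trace through a finite state automaton. A minimal sketch with a hypothetical "every response must be preceded by a request" constraint (the state and message names are illustrative, not from the thesis's tooling):

```python
# Hypothetical constraint automaton: a response is only legal while a
# request is pending, and a trace must end with no request outstanding.
TRANSITIONS = {
    ("idle", "request"): "pending",
    ("pending", "response"): "idle",
}
ACCEPTING = {"idle"}

def conforms(trace, start="idle"):
    """Replay an intercepted message trace; reject on any undefined move."""
    state = start
    for message in trace:
        nxt = TRANSITIONS.get((state, message))
        if nxt is None:
            return False  # constraint violated mid-trace
        state = nxt
    return state in ACCEPTING  # trace must also end in an accepting state

print(conforms(["request", "response"]))  # True
print(conforms(["response", "request"]))  # False
```

Each declarative constraint pattern compiles to one such automaton, and a trace conforms only if every automaton accepts it.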

Reconstruction of 3D Neuronal Structures from Densely Packed Electron Microscopy Data Stacks

Yang, Huei-Fang 2011 August 1900 (has links)
The goal of fully decoding how the brain works requires a detailed wiring diagram of the brain network that reveals the complete connectivity matrix. Recent advances in high-throughput 3D electron microscopy (EM) image acquisition have made it possible to obtain high-resolution 3D imaging data that allows researchers to follow axons and dendrites and to identify pre-synaptic and post-synaptic sites, enabling the reconstruction of detailed neural circuits of the nervous system at the level of synapses. However, these massive data sets pose unique challenges to structural reconstruction because inevitable staining noise, incomplete boundaries, and inhomogeneous staining intensities increase the difficulty of 3D reconstruction and visualization. In this dissertation, a new set of algorithms is provided for the reconstruction of neuronal morphology from stacks of serial EM images. These algorithms include (1) segmentation algorithms for obtaining the full geometry of neural circuits, (2) interactive segmentation tools for manual correction of erroneous segmentations, and (3) a validation method for obtaining a topologically correct segmentation when a set of segmentation alternatives is available. Experimental results obtained using EM images containing densely packed cells demonstrate that (1) the proposed segmentation methods can successfully reconstruct full anatomical structures from EM images, (2) the editing tools let the user easily and quickly refine incorrect segmentations, and (3) the validation method is effective in combining multiple segmentation results. The algorithms presented in this dissertation are expected to contribute to the reconstruction of the connectome and to open new directions in the development of reconstruction methods.

Homeostatic Beliefs: Measurement and Future Applications

Burton, Caitlin 11 January 2010 (has links)
“Homeostatic beliefs” (HBs) denote a sense that one’s life path will remain stable in the long term despite short-term disruptions. Two studies were undertaken to explore whether HBs exist independently of other constructs, and to develop a scale with which to measure them. In Study 1, 158 undergraduate students completed a draft HB scale and theoretically related scales. Convergent and divergent validity were assessed with correlational and regression analyses: HBs are most strongly related to, but not redundant with, optimism, trait extraversion, and satisfaction with life. Using exploratory factor analysis, a six-item HB scale was derived. Study 2 is in progress and will assess the construct validity of the HB scale by attempting to manipulate HBs and thereby influence individuals’ reactions to a mortality salience manipulation. We hypothesize that high HBs may buffer individuals from transient disrupting stimuli such as a mortality salience cue.

Genotype/Haplotype Tagging Methods and their Validation

Zhang, Jun 06 November 2007 (has links)
This study focuses on how MLR-tagging performs for statistical covering, i.e. either maximizing the average R2 for a given number of requested tags, or minimizing the number of tags such that for any non-tag SNP there exists a highly correlated (squared correlation R2 > 0.8) tag SNP. We compare MLR-tagging with Tagger, a software tool for selecting tags in the HapMap project. MLR-tagging needs fewer tags than Tagger in all but 2 of the 6 given test sets. Meanwhile, biologists can detect or collect data from only a small set of SNPs, which raises the problem of estimating the accuracy of tag SNPs when constructing the complete human haplotype map. This study therefore also investigates how MLR-tagging for statistical coverage performs in an unbiased study. The experimental results show that MLR-tagging still selects a small number of tag SNPs well, even without observing all SNPs in the sample.
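The covering criterion above is the squared Pearson correlation between genotype vectors. A minimal sketch (function names are illustrative; genotypes are coded as minor-allele counts 0/1/2 per individual):

```python
def r_squared(x, y):
    """Squared Pearson correlation between two genotype vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov * cov / (var_x * var_y)

def covers(tag, snp, threshold=0.8):
    """A non-tag SNP is statistically covered when r^2 with a tag SNP
    exceeds the threshold (0.8 in this study)."""
    return r_squared(tag, snp) > threshold

print(covers([0, 1, 2, 1], [0, 1, 2, 1]))  # True: identical vectors, r^2 = 1
```

A tag set is valid for the minimization variant when every non-tag SNP is covered by at least one selected tag.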

Improved postmortem diagnosis of <i>Taenia saginata</i> cysticercosis

Scandrett, William Bradley 15 August 2007
Bovine cysticercosis is a zoonotic disease for which cattle are the intermediate hosts of the human tapeworm <i>Taenia saginata</i>. Routine inspection measures are implemented in Canada by the Canadian Food Inspection Agency (CFIA), and similarly elsewhere, for the postmortem detection of larval parasite cysts (cysticerci) in beef destined for human consumption. Detection is based on the gross examination of traditional carcass predilection sites, although it is recognized that the parasite has no true predilection for a particular tissue or site. In order to evaluate the efficacy of the inspection protocol currently implemented in Canada, a study was undertaken to determine the distribution of <i>T. saginata</i> cysticerci in tissues of experimentally infected cattle. Forty-two cross-bred beef cattle were divided into five groups of 5-12 animals each and inoculated orally with either 10000, 5000, 1000, 100 or 10 <i>T. saginata</i> eggs obtained from cases of human taeniosis in Thailand. From 47 to 376 days post-inoculation (DPI), ten animals inoculated with 5000 eggs were killed and the carcasses partitioned into 31 tissue sites. These consisted of the traditionally inspected tissue sites of heart, masseter and pterygoid muscles, tongue, oesophagus, and diaphragm (membranous and crura); as well as non-traditional sites of lung, liver and 20 additional muscles or muscle groups. After the routine inspection for cysticerci of traditional tissue sites, tissues from all sites were each cut into approximately 0.5 cm thick slices and the total number of parasitic cysts and cyst density (cysts/g of tissue) were determined for each site. Traditional sites were similarly evaluated for the remaining 32 animals that were killed between 117 and 466 DPI. Sites were ranked based on cyst density. In the animals for which non-traditional sites were also evaluated, no sites had higher cyst densities than those traditionally inspected. 
When only traditional sites for all animals were compared, the heart ranked highest overall, although not significantly different from the masseter, and was the most frequently affected site. The traditional site of oesophagus was among the poorest of all sites for detection of cysticerci. The heart was confirmed as the site of choice for detection of bovine cysticercosis based on its high cyst density and frequency of infection. There was also enhanced visibility of parasite lesions in the heart due to the relatively early degeneration and resultant gross pathology that occur in cardiac muscle. More thorough examination of the heart is recommended during post-mortem inspection for this parasite, particularly when examining animals from an infected herd.

Currently, confirmation by CFIA of suspect cysticerci recovered during meat inspection relies on gross, stereomicroscopic, or standard histological examination. Although degenerating cysticerci are more likely to be detected and submitted for diagnosis, they often cannot be definitively identified by these methods. A recently developed monoclonal antibody-based immunohistochemical (IHC) assay for post-mortem diagnosis of this parasite was optimized and standardized. The IHC method was compared to the currently used histological assay using 169 degenerated known-positive <i>T. saginata</i> cysticerci collected from the experimental infections in the first study and from field submissions, and known-negative specimens and lesions of various etiologies from non-infected cattle. The IHC assay identified significantly more known-positive bovine cysticerci (91.7%) than the histological method (38.5%), and cross-reacted only with the other cestode species examined. Since <i>T. saginata</i> is the only larval cestode typically found in the muscle of cattle, this cross-reactivity is not significant, and the IHC assay will be a useful tool for the identification of lesions caused by degenerated bovine cysticerci.

This research provided evidence to support changes to the current post-mortem inspection, detection and diagnostic procedures and will contribute to more effective and efficient control of bovine cysticercosis.
