1

The effects of zinc sulfate on ethyl glucuronide immunoassay urine testing

Cawley, Shanna Marie 17 June 2016 (has links)
Published research in the Journal of Analytical Toxicology and by the American Society for Clinical Pathology has confirmed that the presence of zinc sulfate in adulterated urine samples can influence EMIT and ELISA immunoassay results when testing for cannabinoids (THC), cocaine (benzoylecgonine), methamphetamines, opiates (morphine, methadone, and propoxyphene), phencyclidine (PCP), and ethanol (alcohol dehydrogenase). This research included adding zinc sulfate directly to urine samples. In 2006, the Substance Abuse and Mental Health Services Administration (SAMHSA) released an advisory stating that the use of ethyl glucuronide (EtG) as a new biomarker for past alcohol use was promising and warranted more research. Ethyl glucuronide is a direct metabolite of the biotransformation of ethanol in the human body. This compound is excreted in urine and can be used as a specific biomarker for the ingestion of alcohol. Because EtG is produced only when ethanol is metabolized, there are no false positives due to fermentation, and a much longer detection window exists. Scientific literature states that EtG can be present in urine long after ethanol has been eliminated. Testing for EtG is commonly referred to as the “80 hour test” because EtG can be measured up to 80 hours after alcohol consumption. It was hypothesized that, since zinc sulfate added to urine falsely reduces the measured urine alcohol level in the alcohol dehydrogenase assay, zinc sulfate added to Surine™ might likewise falsely reduce the measured level when testing for EtG. Since it is very likely that EtG would still be present in the body after ethanol has been eliminated, samples contained either no ethanol or 5% ethanol (5 g/dL). Samples were spiked with zinc sulfate at 10 mg/mL or 15 mg/mL, or contained no zinc sulfate (0 mg/mL). Additionally, duration testing was conducted to see whether there were any observable differences between testing the samples fresh and testing them after one week of refrigerated storage, brought back to room temperature prior to testing. Two different immunoassay EtG tests were used to perform the analysis. It was concluded that zinc sulfate added directly to the sample affected one of the immunoassay tests, fading the test and control regions regardless of whether EtG or ethanol was present. Additionally, it was concluded that Surine™ samples containing zinc sulfate could easily be distinguished from samples free of zinc sulfate by the presence of a white cloudy precipitate.
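To make the spiking levels above concrete, the short sketch below works through the preparation arithmetic for one sample. It is purely illustrative and not the thesis protocol: the 50 mL working volume and the use of neat ethanol (density 0.789 g/mL) are assumptions.

```python
# Illustrative sketch only (not the thesis protocol): amounts of zinc sulfate
# and ethanol needed to reach the concentrations described in the abstract.
# The 50 mL working volume is an assumed value chosen for illustration.

SAMPLE_VOLUME_ML = 50.0  # assumed working volume per sample

def zinc_sulfate_mass_mg(target_mg_per_ml: float, volume_ml: float = SAMPLE_VOLUME_ML) -> float:
    """Mass of zinc sulfate (mg) required to reach the target concentration."""
    return target_mg_per_ml * volume_ml

def ethanol_volume_ml(target_g_per_dl: float, volume_ml: float = SAMPLE_VOLUME_ML,
                      density_g_per_ml: float = 0.789) -> float:
    """Volume of neat ethanol (mL) giving the target concentration in g/dL."""
    grams_needed = target_g_per_dl * volume_ml / 100.0  # g/dL -> grams per sample
    return grams_needed / density_g_per_ml

if __name__ == "__main__":
    for zn in (0.0, 10.0, 15.0):      # zinc sulfate levels used in the study (mg/mL)
        for etoh in (0.0, 5.0):       # ethanol levels used in the study (g/dL)
            print(f"ZnSO4 {zn:4.1f} mg/mL -> add {zinc_sulfate_mass_mg(zn):6.1f} mg; "
                  f"ethanol {etoh:3.1f} g/dL -> add {ethanol_volume_ml(etoh):4.2f} mL")
```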
2

Experimental study of density fluctuations in the STOR-M tokamak by small-angle microwave scattering

Livingstone, Stephen 27 January 2006
Density fluctuations in high-temperature fusion plasmas have been a central challenge to the development of fusion power. They are the cause of excessive anomalous losses from the plasma and are still not fully understood. A microwave scattering experiment is performed on the Saskatchewan Torus-Modified (STOR-M) tokamak for the first time to study these density fluctuations with wave-numbers in the range k = 5 cm⁻¹ to 10 cm⁻¹. The fluctuations are found to follow kρ_s scaling consistent with ion drift waves; signatures of the electron temperature gradient (ETG) mode connected with anomalous electron losses are not detected. The fluctuation level in STOR-M is measured to be ñ/n ≈ 0.1 at a mean perpendicular wave-number k_⊥ ≈ 7 cm⁻¹ and is reported for the first time. The fluctuation levels are inversely proportional to the energy confinement time, suggesting that these fluctuations drive anomalous particle and energy losses from STOR-M. The system is now fully operational and this work paves the way for future experiments with this equipment.
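For reference, the ion sound gyroradius that sets the kρ_s scaling mentioned above is conventionally defined as follows; this is a standard textbook relation added for context, not a formula taken from the thesis.

```latex
% Ion sound gyroradius entering the k\rho_s scaling of drift-wave turbulence.
% T_e: electron temperature (energy units), m_i: ion mass, e: elementary
% charge, B: magnetic field strength.
\[
  \rho_s = \frac{c_s}{\omega_{ci}} = \frac{\sqrt{m_i T_e}}{eB},
  \qquad
  c_s = \sqrt{\frac{T_e}{m_i}}, \qquad \omega_{ci} = \frac{eB}{m_i}.
\]
% Ion-scale drift-wave activity typically peaks around k_\perp \rho_s \lesssim 1,
% whereas ETG turbulence lives at much shorter wavelengths, k_\perp \rho_s \gg 1.
```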
3

A search for strong gravitational lenses in early-type galaxies using UKIDSS

Husnindriani, Prahesti January 2015 (has links)
This work is focused on a search for strong gravitational lenses in early-type galaxies (ETGs). The sample comprises 4,706 galaxies spanning the magnitude range 15.0 < i < 18.0 and the colour range 3.5 < (u-r) < 5.0. Two databases were employed as sources: K-band images from the UKIDSS Large Area Survey and g, r, i images from SDSS. All sample galaxies were fitted with a Sérsic component and automatically processed using GALFIT (Peng et al. 2002; Peng et al. 2010) inside a Python script (Appendix A). The first classification yielded 259 galaxies that appear as single galaxies in their K-band images. These galaxies were then reclassified based on image contouring in the g, r, i, and K filters, resulting in three categories: Sample A (99 galaxies), Sample B (96 galaxies), and Sample C (64 galaxies).
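As a rough illustration of the kind of automation described above (driving GALFIT single Sérsic fits from Python), the sketch below batch-runs GALFIT over pre-written input files. It is not the thesis's Appendix A script; the directory layout, file names and the plain `galfit <input-file>` invocation are assumptions.

```python
# Rough sketch (not the thesis's Appendix A script): batch single-Sersic
# fitting of K-band cutouts by calling GALFIT (Peng et al. 2002, 2010) from
# Python. The "feedme/" directory of pre-written GALFIT input files and the
# plain "galfit <input-file>" invocation are assumptions for illustration.
import subprocess
from pathlib import Path

def run_galfit(input_file: Path) -> bool:
    """Run GALFIT on one prepared input file; return True on a zero exit code."""
    result = subprocess.run(["galfit", str(input_file)],
                            capture_output=True, text=True)
    return result.returncode == 0

if __name__ == "__main__":
    inputs = sorted(Path("feedme").glob("*.feedme"))
    for input_file in inputs:
        status = "ok" if run_galfit(input_file) else "failed"
        print(f"{input_file.name}: {status}")
```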
4

Electron heat transport in a tokamak by direct numerical simulation of small-scale turbulence

Labit, Benoit 24 October 2002 (has links) (PDF)
Understanding the turbulent state of a fusion plasma, which is responsible for the short confinement times observed, is a fundamental issue on the path to producing energy by this route. In the best-performing machines, tokamaks, the measured ion and electron thermal conductivities are of the same order of magnitude. The potential sources of the turbulence are the strong gradients of temperature, density, etc. present in the core of a tokamak plasma. While heat losses through the ion channel are relatively well understood, the origin of the large electron heat transport remains largely unknown. In addition to electrostatic velocity fluctuations, there are magnetic fluctuations, to which fast particles are particularly sensitive. Experimentally, the confinement time can be expressed as a function of dimensionless parameters. These scaling laws are still too imprecise; nevertheless, strong dependences on the ratio of kinetic to magnetic pressure, β, and on the normalized Larmor radius, ρ*, are predicted.

This thesis seeks to determine whether a three-dimensional, electromagnetic, nonlinear fluid model based on a particular instability is relevant for describing heat losses through the electron channel, and to determine how the associated turbulent transport depends on dimensionless parameters, including β and ρ*. The chosen instability is an interchange instability driven by the electron temperature gradient (Electron Temperature Gradient, ETG, driven turbulence). The nonlinear model is built from the Braginskii equations. The simulation code developed is global in the sense that an incoming heat flux is imposed, leaving the gradients free to evolve.

From the nonlinear simulations, three main features of the fluid ETG model emerge: the turbulent heat transport is essentially electrostatic; the potential and pressure fluctuations form radially elongated structures; and the observed transport level is much lower than that measured experimentally.

A study of the dependence of heat transport on the ratio of kinetic to magnetic pressure showed that this parameter has little impact, contradicting Ohkawa's empirical law. By contrast, the important role of the normalized electron Larmor radius in the heat transport was shown unambiguously: the confinement time is inversely proportional to this parameter. Finally, the heat transport was found to depend only weakly on the magnetic shear and on the inverse aspect ratio.

Although the transport level observed in the simulations is lower than that measured experimentally, a direct comparison with a Tore Supra discharge was attempted. This tokamak is particularly well suited to studying electron heat losses. Keeping most of the parameters of a well-documented Tore Supra discharge, the nonlinear simulation gives a temperature-gradient threshold close to the experimental value. The observed transport level is roughly a factor of fifty lower than the measured transport. One important parameter that could not be matched is the normalized Larmor radius.

The ρ* limitation will have to be overcome in order to confirm these results. Finally, a rigorous comparison with gyrokinetic simulations will make it possible to decide whether or not the ETG instability can account for the observed heat losses.

Keywords: thermonuclear fusion, tokamak, plasma, ETG turbulence, numerical simulations
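For reference, the dimensionless parameters named above are conventionally defined as below; these are standard tokamak definitions added for context, not reproduced from the thesis.

```latex
% Standard definitions of the dimensionless parameters discussed above.
% n: density, T: temperature, B: magnetic field, a: minor radius,
% \rho_s: (ion sound) Larmor radius; the electron analogue applies for ETG.
\[
  \beta = \frac{2\mu_0\, n T}{B^{2}},
  \qquad
  \rho_* = \frac{\rho_s}{a}.
\]
% A confinement time scaling inversely with the normalized Larmor radius
% corresponds to a gyro-Bohm heat diffusivity,
\[
  \chi_{\mathrm{gB}} \;\propto\; \rho_*\,\frac{T}{eB}.
\]
```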
5

New developments in analytical toxicology for the investigation of drug facilitated crime

Paul, Richard January 2007 (has links)
Drug facilitated assault (DFA) is an increasing problem in the UK. The crime often occurs through the surreptitious administration of a drug into a victim's drink, rendering the victim unable to resist the assault. The detection of these drugs in a biological specimen from the victim is one of the most challenging facets of forensic chemistry. Drug concentrations can be very low, as often only a single dose is administered, and the pharmacodynamics of commonly employed drugs further hinders the testing process. The research presented in this work shows the development of several new assays for the detection of flunitrazepam, gamma-hydroxybutyrate (GHB) and ethyl glucuronide (EtG) in a variety of biological matrices. New methods of drug testing in blood and urine are demonstrated, as well as interesting developments in the field of hair testing. Using hair to detect drug exposure allows a much wider window of detection than the more traditional matrices of blood and urine. New methods are presented in this work using gas chromatography-tandem mass spectrometry (GC-MS/MS) to detect drugs in hair. Validation data are presented along with the results of authentic DFA testing. All aspects of the drug testing procedure have been evaluated, from new extraction techniques utilising water instead of solvents, to novel clean-up stages involving the unique combination of SFE and SPME. Several confirmation techniques are explored, including single quadrupole, triple quadrupole and ion trap mass spectrometry. In addition to developing assays for DFA cases, the versatility of this type of analytical chemistry is explored in two population studies. The first study evaluates alcohol consumption in two groups, drug users and non-drug users, in medico-legal cases. There is an anecdotal belief amongst drug clinic staff that alcohol use is lower in drug users than in non-drug users. This study presents the first scientific confirmation of this belief through EtG (an alcohol metabolite) testing in the hair of the two groups. The second study investigates whether there is a correlation between EtG and cocaethylene (a metabolite of cocaine only produced in the presence of alcohol) in cocaine users. Results of this study suggest that there is no positive correlation between the two compounds. The research presented in this thesis aims to further the analytical science surrounding DFA investigation and provide accurate, sensitive and reliable methodology for drug testing in blood, urine and hair.
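A minimal sketch of the kind of correlation check described in the second population study is given below. It is illustrative only: the CSV file name and column names are assumptions, and the thesis does not state which correlation statistic was used, so Spearman's rank correlation is chosen here as a robust default for concentration data.

```python
# Illustrative sketch of a correlation check between hair EtG and cocaethylene
# concentrations, as in the second population study described above.
# "hair_results.csv" and its column names are assumed for illustration; the
# thesis does not state which correlation statistic was used, so Spearman's
# rank correlation is used here as a robust default for concentration data.
import pandas as pd
from scipy.stats import spearmanr

def etg_cocaethylene_correlation(csv_path: str = "hair_results.csv"):
    df = pd.read_csv(csv_path)                     # expects columns: etg_pg_mg, ce_pg_mg
    paired = df[["etg_pg_mg", "ce_pg_mg"]].dropna()
    rho, p_value = spearmanr(paired["etg_pg_mg"], paired["ce_pg_mg"])
    return rho, p_value

if __name__ == "__main__":
    rho, p = etg_cocaethylene_correlation()
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```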
6

Reducing uncertainty in new product development

Higgins, Paul Anthony January 2008 (has links)
Research and development engineering is the cornerstone of humanity's evolution. It is perceived to be a systematic creative process which ultimately improves the living standard of a society through the creation of new applications and products. The commercial paradigm that governs project selection, resource allocation and market penetration prevails when the focus shifts from pure research to applied research. Furthermore, the road to success through commercialisation is difficult for most inventors, especially in a vast and isolated country such as Australia, which is located a long way from wealthy and developed economies. While market-leading products are considered unique, the actual process to achieve these products is essentially the same: progressing from an idea, through development, to an outcome (if successful). Unfortunately, statistics indicate that only 3% of ‘ideas’ are significantly successful, 4% are moderately successful, and the remainder ‘evaporate’ in that form (Michael Quinn, Chairman, Innovation Capital Associates Pty Ltd). This study demonstrates and analyses two techniques developed by the author which reduce uncertainty in the engineering design and development phase of new product development and therefore increase the probability of a successful outcome. This study expands the existing knowledge of the engineering design and development stage in the new product development process and is couched in the identification of practical methods which have been successfully used to develop new products by the Australian Small-to-Medium Enterprise (SME) Excel Technology Group Pty Ltd (ETG). Process theory is the term most commonly used to describe scientific study that identifies occurrences that result from a specified input state to an output state, thus detailing the process used to achieve an outcome. The thesis identifies relevant material and analyses recognised and established engineering processes utilised in developing new products. The literature identified that case studies are a particularly useful method for supporting problem-solving processes in settings where there are no clear answers or where problems are unstructured, as in New Product Development (NPD). This study describes, defines and demonstrates the process of new product development within the context of historical product development and a ‘live’ case study associated with an Australian Government START grant awarded to Excel Technology Group in 2004 to assist in the development of an image-based vehicle detection product. This study proposes two techniques which reduce uncertainty and thereby improve the probability of a successful outcome. The first technique provides a predicted project development path, or forward engineering plan, which transforms the initial ‘fuzzy idea’ into a potential and achievable outcome. This process qualifies the ‘fuzzy idea’ as a potential, rational or tangible outcome which is within the capability of the organisation. Additionally, this process proposes that a tangible or rational idea can be deconstructed in a reverse engineering process in order to create a forward engineering development plan. A detailed, structured forward engineering plan reduces the uncertainty associated with new product development unknowns and therefore contributes to a successful outcome. This is described as the RETRO technique. The study recognises, however, that this claim requires qualification and proposes a second technique.
The second technique proposes that a two-dimensional spatial representation, with productivity and consumed resources as its axes, provides an effective means to qualify progress and expediently identify variation from the predicted plan. This spatial representation technique allows a quick response which in itself has a predictive attribute associated with directing the project back onto its predicted path. The process involves a coterminous comparison between the predicted development path and the evolving actual project development path. A consequence of this process is verification of progress or the application of informed, timely and quantified corrective action. The process also identifies the degree of success achieved in the engineering design and development phase of new product development, where success is defined as achieving a predicted outcome. This spatial representation technique is referred to as NPD Mapping. The study demonstrates that these are useful techniques which aid SMEs in achieving successful new product outcomes because they are easily administered, measure and represent relevant development-process elements and functions, and enable expedient, quantified responsive action when the evolving path varies from the predicted path. These techniques go beyond timeline representations such as Gantt charts and PERT analysis, and represent the base variables of consumed resources and productivity/technical achievement in a manner that facilitates higher-level interpretation of time, effort, degree of difficulty and product complexity in order to facilitate informed decision making. This study presents, describes, analyses and demonstrates an SME-focused engineering development technique, developed by the author, that begins with a ‘fuzzy idea’ in the mind of the inventor and concludes with a successful new product outcome delivered on time and within budget. Further research on a wider range of SME organisations undertaking new product development is recommended.
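One way to read the NPD Mapping idea described above is as a comparison of two curves in the (consumed resources, productivity) plane. The sketch below is an illustrative interpretation rather than the author's tool; the linear interpolation and the toy milestone data are assumptions made purely for demonstration.

```python
# Illustrative interpretation of the "NPD Mapping" idea described above:
# compare an evolving actual project path against a predicted path in the
# (consumed resources, productivity) plane. This is an assumption-laden
# sketch, not the author's tool; linear interpolation is chosen arbitrarily.
import numpy as np

def productivity_deviation(predicted, actual):
    """Deviation of actual productivity from the predicted path, evaluated
    at the resource levels actually consumed.

    predicted, actual: sequences of (consumed_resources, productivity) pairs,
    sorted by consumed_resources. Returns actual minus predicted productivity.
    """
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    # Interpolate the predicted productivity at each actual resource level.
    expected = np.interp(actual[:, 0], predicted[:, 0], predicted[:, 1])
    return actual[:, 1] - expected

if __name__ == "__main__":
    # Hypothetical milestone data, purely for illustration (resource units, % complete).
    predicted_path = [(0, 0), (25, 30), (50, 60), (75, 85), (100, 100)]
    actual_path = [(0, 0), (30, 25), (55, 50)]
    print(productivity_deviation(predicted_path, actual_path))
```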
