191

Evaluating reasoning heuristics for a hybrid theorem proving platform

Ackermann, Jacobus Gideon 06 1900 (has links)
Text in English with abstracts in English, Afrikaans and isiZulu / The formalisation of first-order logic and axiomatic set theory in the first half of the 20th century, along with the advent of the digital computer, paved the way for the development of automated theorem proving. In the 1950s, the automation of proof developed from proving elementary geometric problems and finding direct proofs for problems in Principia Mathematica by means of simple, human-oriented rules of inference. A major advance in the field of automated theorem proving occurred in 1965, with the formulation of the resolution inference mechanism. Today, powerful Satisfiability Modulo Theories (SMT) provers combine SAT solvers with sophisticated knowledge from various problem domains to prove increasingly complex theorems. The combinatorial explosion of the search space is viewed as one of the major challenges to progress in the field of automated theorem proving. Pioneers from the 1950s and 1960s had already identified the need for heuristics to guide the proof search effort. Despite theoretical advances in automated reasoning and technological advances in computing, the size of the search space remains problematic when increasingly complex proofs are attempted. Today, heuristics are still useful and necessary to discharge complex proof obligations. In 2000, a number of heuristics were developed to aid the resolution-based prover OTTER in finding proofs for set-theoretic problems. The applicability of these heuristics to next-generation theorem provers was evaluated in 2009: the provers Vampire and Gandalf required 90% and 80% of the applicable OTTER heuristics, respectively. This dissertation investigates the applicability of the OTTER heuristics to theorem proving in the hybrid theorem proving environment Rodin, a system modelling tool suite for the Event-B formal method. We show that only 2 of the 10 applicable OTTER heuristics were useful when discharging proof obligations in Rodin. Even though we argue that the OTTER heuristics were largely ineffective when applied to Rodin proofs, heuristics were still needed when proof obligations could not be discharged automatically. We therefore propose a number of our own heuristics targeted at theorem proving in the Rodin tool suite. / School of Computing / M. Sc. (Computer Science)
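For readers unfamiliar with the resolution inference mechanism mentioned above, its propositional core can be sketched as follows (a standard textbook form, not a formula taken from the dissertation): from two clauses containing a complementary pair of literals, their resolvent is inferred,

\[
\frac{C_1 \lor P \qquad\quad C_2 \lor \lnot P}{C_1 \lor C_2}.
\]

In the first-order setting used by provers such as OTTER and Vampire, the complementary literals are first made identical by a most general unifier, and the search for the empty clause (a contradiction) drives the proof; the heuristics discussed above aim to contain the combinatorial growth of that search.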
192

A second-order cybernetic explanation for the existence of network direct selling organisations as self-creating systems

Davis, Corne 18 August 2011 (has links)
Network Direct Selling Organisations (NDSOs) exist in more than 50 countries and have more than 74 million members. The most recent statistical information reveals that the vast majority of members do not earn significant income. Criticism of these organisations revolves around the ethicality of consumption, the commercialisation of personal relationships, and the exploitation of unrealistic expectations. This study aims to explore how communication creates networks that sustain an industry of this kind despite the improbability of its existence. The study commences with a description of NDSOs from historical, operational, tactical, and strategic perspectives. Given the broader context created by the global presence of this industry, cybernetics has been selected as a meta-theoretical perspective for the study of communication. The more recent developments of second-order cybernetics and social autopoiesis are introduced to communication theory as a field. Niklas Luhmann's new social theory of communication is assessed and applied in relation to existing communication theory. New conceptual models are developed to explore communication as the unity of the synthesis of information, utterance, understanding, and expectations as selections that occur both consciously and unconsciously, intentionally and unintentionally. These models indicate the multiplexity of individual and social operationally closed, yet informationally open systems, and they are used here to provide a systemic and coherent alternative to orthodox communication approaches to the study of organisations. The study adopts a constructivist epistemological stance and propounds throughout the necessity of further interdisciplinary collaboration. The study concludes that individuals are composite unities of self-creating systems, and they co-create social systems by self-creating and co-creating meaning. Meaning is described as the continuous virtualisation and actualisation of potentialities that in turn coordinate individual and social systems' actions. A communication process flow model is created to provide a theoretical explanation for the existence of NDSOs as self-creating systems. The study aims to show that communication has arguably become the most pervasive discipline as a result of the globally interactive era. It is shown that second-order cybernetics and social autopoiesis raise several further questions to be explored within communication theory as a field. / Communication, first-order cybernetics, second-order cybernetics, complexity and complex systems, autopoiesis, self-reference, recursivity, operational closure, system boundaries, Network Direct Selling Organisations / Communication / D. Litt. et Phil. (Communication)
193

3+1 Approach to Cosmological Perturbations : Deriving the First Order Scalar Perturbations of the Einstein Field Equations / Kosmologisk störningsräkning utifrån 3+1 formalismen : Härledning av första ordningens skalära störningar av Einsteins fältekvationer

Wilhelm, Söderkvist Vermelin January 2016 (has links)
Experimental data suggest that the universe is homogeneous and isotropic on sufficiently large scales. An exact solution of the Einstein field equations exists for a homogeneous and isotropic universe, also known as a Friedmann-Lemaître-Robertson-Walker (FLRW) universe. However, this model is only a first approximation, since we know that, locally, the universe has anisotropic and inhomogeneous structures such as galaxies and clusters of galaxies. In order to introduce inhomogeneities and anisotropies into the model, one uses perturbative methods. In cosmological perturbation theory, the FLRW universe is taken as the zeroth-order term in a perturbation expansion, and higher-order terms are derived and matched against observations. In this thesis I present a review of the main concepts of general relativity, discuss the 3+1 formalism, which casts the Einstein field equations in a form suited to the perturbative analysis, and lastly derive the first-order scalar perturbations of the Einstein field equations.
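As a concrete illustration of the object of study, the first-order scalar perturbations of an FLRW background are commonly written in the conformal Newtonian (longitudinal) gauge as

\[
ds^2 = a^2(\tau)\left[-(1+2\Phi)\,d\tau^2 + (1-2\Psi)\,\delta_{ij}\,dx^i\,dx^j\right],
\]

where a(\tau) is the scale factor of the homogeneous background (the zeroth-order term of the expansion) and the scalar potentials \Phi and \Psi are the first-order perturbations inserted into the Einstein field equations. This is the standard textbook form; the thesis itself may adopt a different gauge or sign convention.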
194

Advanced Reasoning about Dynamical Systems

Gu, Yilan 17 February 2011 (has links)
In this thesis, we study advanced reasoning about dynamical systems in a logical framework, the situation calculus. In particular, we consider improving the efficiency of reasoning about action in the situation calculus from three different angles. First, we propose a modified situation calculus based on two-variable predicate logic with counting quantifiers. We show that solving the projection and executability problems via regression in such a language is decidable, and prove that in general these two problems are co-NExpTime-complete in the modified language. We also consider further restricting the format of regressable formulas and basic action theories (BATs) to obtain better computational complexity for reasoning about action via regression, and mention possible applications to the formalization of Semantic Web services. Then, we propose a hierarchical representation of actions based on the situation calculus to facilitate the development, maintenance and elaboration of very large taxonomies of actions. We show that our axioms can be more succinct, while still using an extended regression operator to solve the projection problem. Moreover, such a representation has significant computational advantages: for taxonomies of actions that can be represented as finitely branching trees, the regression operator can sometimes work exponentially faster with our theories than it does with the BATs of the current situation calculus. We also propose a general guideline on how a taxonomy of actions can be constructed from a given set of effect axioms. Finally, we extend the current situation calculus with order-sorted logic. In the new formalism, we add sort theories to the usual initial theories to describe taxonomies of objects. We then investigate what well-sortedness means for BATs in this framework, and consider extending the current regression operator with well-sortedness checking and unification techniques. With the modified regression, we gain computational efficiency by terminating regression earlier when reasoning tasks are ill-sorted and by reducing the search spaces for well-sorted objects. We also study the connection between the order-sorted situation calculus and the current situation calculus.
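To make the regression-based reasoning above concrete: in Reiter-style basic action theories, each fluent F is governed by a successor state axiom of the schematic form

\[
F(\vec{x}, do(a,s)) \;\equiv\; \gamma^{+}_{F}(\vec{x}, a, s) \;\lor\; \bigl(F(\vec{x}, s) \land \lnot\,\gamma^{-}_{F}(\vec{x}, a, s)\bigr),
\]

where \gamma^{+}_{F} and \gamma^{-}_{F} collect the conditions under which action a makes F true or false. Regression rewrites a query about do(a, s) into a query about s by substituting the right-hand side, so repeated regression reduces the projection problem (does a formula hold after a sequence of actions?) to entailment from the initial theory. This is the standard formulation from the situation calculus literature, not a formula quoted from the thesis.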
195

Development and evaluation of a solid oral dosage form for an artesunate and mefloquine drug combination / Abel Hermanus van der Watt

Van der Watt, Abel Hermanus January 2014 (has links)
Malaria affects about forty percent of the world’s population. More than 1.5 million fatalities due to malaria occur annually, and parasite resistance to existing antimalarial drugs such as mefloquine has already reached disturbingly high levels in South-East Asia and on the African continent. Consequently, there is a dire need for new drugs or formulations for the prophylaxis and treatment of malaria. Artesunate, an artemisinin derivative, represents a new category of antimalarials that is effective against drug-resistant Plasmodium falciparum strains and is of significance in the current antimalarial campaign. As formulating an artemisinin-based combination therapy (ACT) double fixed-dose combination is technically difficult, it is essential that fixed-dose combinations are shown to have satisfactory ingredient compatibility, stability, and dissolution rates similar to the separate oral dosage forms. Since the general deployment of a combination of artesunate and mefloquine in 1994, the cure rate increased again to almost 100% from 1998 onwards, and there has been a sustained decline in the incidence of Plasmodium falciparum malaria in the experimental studies (Nosten et al., 2000:297; WHO, 2010:17). However, the successful formulation of a solid oral dosage form and fixed-dose combination of artesunate and mefloquine remains both a market opportunity and a challenge. Artesunate and mefloquine both exhibited poor flow properties. Furthermore, the different elimination half-lives, treatment dosages and solubility properties of artesunate and mefloquine required different formulation approaches. To substantiate the FDA’s pharmaceutical quality-by-design concept, the double fixed-dose combination of artesunate and mefloquine required strict preliminary formulation considerations regarding compatibility between excipients and between the APIs. Materials and process methods were only considered if theoretically and experimentally proved safe. Infrared absorption spectroscopy (IR) and X-ray powder diffraction (XRPD) data proved compatibility between ingredients and stability during the complete manufacturing process by a peak-by-peak correlation. Scanning electron microscopy (SEM) micrographs provided explanations for the inferior flow properties exhibited by the investigated APIs. Particle size analysis and SEM micrographs confirmed that the larger, rounder and more consistently sized particles of the granulated APIs contributed to improved flow under the specified testing conditions. A compressible mixture containing 615 mg of the APIs was developed, in accordance with the WHO recommendation for uncomplicated falciparum malaria of 25 mg/kg of mefloquine taken in two or three divided doses and 4 mg/kg/day of artesunate for 3 days. Mini-tablets of artesunate and mefloquine were compressed separately and successfully with the required therapeutic dosages and complied with pharmacopoeial standards. Preformulation studies eventually led to a formula for a double fixed-dose combination, with the specific aim of delaying the release of artesunate due to its short half-life. A factorial design revealed the predominant factors contributing to the successful wet granulation of artesunate and mefloquine, and a fractional factorial design identified the optimum factors and factor levels. The application of the granulation fluid (20% w/w) by a spraying method proved to be sufficient for both artesunate and mefloquine. 
Eudragit® L100, an acrylic polymer and coating agent compatible with artesunate, was employed to delay the release of approximately half of the artesunate dose from the double fixed-dose combination tablet until a pH of 6.8 was reached. A compressible mixture was identified and formulated to contain 200 mg of artesunate and 415 mg of mefloquine per tablet. The physical properties of the tablets complied with BP standards. An HPLC method from the available literature was adapted and validated for the analytical procedures. Dissolution studies according to a USP method were conducted to verify and quantify the release of the APIs from the double fixed-dose combination. The initial dissolution rate (DRi) of artesunate and mefloquine in the acidic dissolution medium was rapid, as required. The enteric-coated fraction of the artesunate exhibited no release in an acidic environment after 2 hours, but rapid release in a medium with a pH of 6.8. The structure of the granulated particles of mefloquine may have contributed to its first-order release profile in the dissolution media. A linear correlation was present between the rate of mefloquine release and the percentage of mefloquine dissolved (R² = 0.9484). Additionally, a linear relationship was found between the logarithm of the percentage of mefloquine remaining and time (R² = 0.9908). First-order drug release is the dominant release profile found in the pharmaceutical industry today and is consistent with the release kinetics obtained for mefloquine. A concept, pre-clinical-phase double fixed-dose combination solid oral dosage form for artesunate and mefloquine was developed. The double fixed-dose combination was designed in accordance with the WHO’s recommended oral dosage regimen of artesunate and mefloquine for the treatment of uncomplicated falciparum malaria, and its specifications were developed in close accordance with the FDA’s quality-by-design concept and WHO recommendations. An HPLC analytical procedure was developed to verify the presence of artesunate and mefloquine, and the dissolution profiles of artesunate and mefloquine were investigated during the dissolution studies. / PhD (Pharmaceutics), North-West University, Potchefstroom Campus, 2014
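The first-order release behaviour reported for mefloquine corresponds to the standard kinetic model below (a general form with symbols chosen here for illustration, not notation taken from the thesis): the amount of undissolved drug decays exponentially, so the logarithm of the percentage remaining is linear in time, which is exactly the linear relationship (R² = 0.9908) noted above.

\[
\frac{dM}{dt} = -kM \;\Longrightarrow\; M_t = M_0\,e^{-kt} \;\Longrightarrow\; \ln\!\left(\frac{M_t}{M_0}\right) = -kt,
\]

where M_0 is the initial amount of mefloquine, M_t the amount remaining undissolved at time t, and k the first-order release rate constant.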
196

Étude du traitement visuel simple et complexe chez les enfants autistes / A study of simple and complex visual processing in autistic children

Bertrand-Rivest, Jessica 09 1900 (has links)
Atypical perceptual information processing is commonly described in Autism Spectrum Disorders (ASD). In the visual modality, influential work with autistic adults suggests altered connectivity within specialized local networks defining the response properties of stimulus-driven mechanisms. This has led to a hypothesis stipulating that the efficiency of autistic visual perception is contingent on the complexity of the neural network involved (the complexity-specific hypothesis). When several cortical areas must communicate with each other (as in texture-defined, or second-order, perception), reduced sensitivity to visual input is observed in autistic individuals. In contrast, when visual processing predominantly relies on the primary visual cortex V1 (as in luminance-defined, or first-order, perception), their sensitivity is either enhanced (stationary stimuli) or intact (moving stimuli). This dissociation in performance is unique to ASD and suggests atypical connectivity within the visual cortex. The precise type of neural alteration remains unknown, however. In addition, studies focusing on younger individuals are needed to define the developmental trajectories of perceptual abilities in autism; this issue is crucial for perceptual theories of ASD. The first experiment investigates whether the dissociation regarding first- and second-order spatial vision is also present in school-aged children with autism, combining behavioural (psychophysics) and neuroimaging (visual evoked potentials: VEPs) methods. The second experiment was designed to assess the integrity of one type of neural connection known to be involved in texture processing: feedback processes from extrastriate areas towards lower hierarchical levels (V1). To this end, we used a visual texture segregation task and isolated a texture-segregation-specific VEP component that mainly reflects feedback modulation in the visual cortex. Behavioural measures from the first experiment do not reveal differences in visual thresholds between typically developing and autistic children for either luminance- or texture-defined stimuli. With respect to electrophysiology, there is no group difference in brain activity associated with luminance-defined stimuli. However, unlike typical children, autistic children do not show reliable enhancements of brain activity in response to texture-defined stimuli during time windows more closely associated with second-order processing. These differences emerge after 200 msec post-stimulation and mainly involve extrastriate areas over occipito-temporal and parietal scalp regions. Regarding the second experiment, the texture-segregation-specific VEP component is greatly diminished over the right as compared to the left occipito-lateral cortex in autism, while it shows no hemispheric asymmetry in typically developing children. In summary, in line with the complexity-specific hypothesis, the cortical representation of second-order attributes (texture) is atypically reduced in autistic children. This thesis further reveals that altered feedback from extrastriate visual areas to lower areas (V1) is one of the neuronal mechanisms involved in atypical texture processing. In contrast to the results obtained in adults with autism, however, first-order vision (luminance) is not found to be superior in autistic children.
197

Multidimensional Methods: Applications in Drug-Enzyme Intrinsic Clearance Determination and Comprehensive Two-Dimensional Liquid Chromatography Peak Volume Determination

Thekkudan, Dennis 07 December 2009 (has links)
The goal of the first project was to evaluate strategies for determining the in vitro intrinsic clearance (CLint) of dextrorphan (DR) as metabolized by the UGT2B7 enzyme to obtain dextrorphan glucuronide (DR-G). A direct injection liquid chromatography-mass spectrometry (LC-MS) method was used to monitor products using the pseudo-first-order (PFO) model. Standard enzymatic incubations were also quantified using LC-MS. These data were fit utilizing both PFO and Michaelis-Menten (MM) models to determine estimates of kinetic parameters. The CLint was determined to be 0.28 (± 0.08) µL/min/mg protein for a baculovirus insect cell-expressed UGT2B7 enzyme. This is the first confirmation that dextrorphan is specifically metabolized by UGT2B7 and the first report of these kinetic parameters. Simulated chromatographic data were used to determine the precision and accuracy in the estimation of peak volumes in comprehensive two-dimensional liquid chromatography (2D-LC). Volumes were determined both by summing the areas in the second dimension chromatograms via the moments method and by fitting the second dimension areas to a Gaussian peak. When only two second dimension signals are substantially above baseline, the accuracy and precision are poor because the solution to the Gaussian fitting algorithm is indeterminate. The fit of a Gaussian peak to the areas of the second dimension peaks is better at predicting the peak volume when there are at least three second dimension injections above the limit of detection. Based on simulations where the sampling interval and sampling phase were varied, we conclude for well-resolved peaks that the optimum precision in peak volumes in 2D separations will be obtained when the sampling ratio is approximately two. This provides an RSD of approximately 2 % for the signal-to-noise (S/N) used in this work. The precision of peak volume estimation for experimental data was also assessed, and RSD values were in the 4-5 % range. We conclude that the poorer precision found in the 2D-LC experimental data as compared to 1D-LC is due to a combination of factors, including variations in the first dimension peak shape related to undersampling and loss in S/N due to the injection of multiple smaller peaks onto the second dimension column.
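For context on how an intrinsic clearance value such as the 0.28 µL/min/mg protein reported above is typically derived from a Michaelis-Menten (MM) fit (a standard relationship, not a formula quoted from the dissertation):

\[
v = \frac{V_{\max}\,[S]}{K_m + [S]}, \qquad CL_{int} = \frac{V_{\max}}{K_m},
\]

and when the substrate concentration is well below K_m the rate law reduces to v \approx (V_{\max}/K_m)[S], i.e. the pseudo-first-order (PFO) behaviour referred to in the abstract.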
198

Kinetics of the electrocoagulation of oil and grease

Rincon, Guillermo 20 May 2011 (has links)
Research on the electrocoagulation (EC) of hexane extractable materials (HEM) has been conducted at the University of New Orleans using a proprietary bench-scale EC reactor. The original reactor configuration forced the fluid to follow a vertical upward-downward path. An alternate electrode arrangement was introduced so that the flow path became horizontal. Both configurations were evaluated by comparing the residence time distribution (RTD) data generated in each case. These data indicated internal recirculation and stagnant water when the fluid followed a vertical path. These anomalies were attenuated when the fluid flowed horizontally and at a velocity higher than 0.032 m s⁻¹. A series of EC experiments was performed using a synthetic emulsion with a HEM concentration of approximately 700 mg l⁻¹. It was confirmed that EC of HEM follows first-order kinetics, and kinetic constants of 0.0441 s⁻¹ and 0.0443 s⁻¹ were obtained from the dispersion and tanks-in-series (TIS) models, respectively. In both cases R² was 0.97. The TIS model also indicated that each cell of the EC reactor behaves as an independent continuous stirred-tank reactor.
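Taken at face value, first-order kinetics with the fitted constants above imply an exponential decay of the HEM concentration. The back-of-the-envelope calculation below ignores the non-ideal flow effects that the dispersion and TIS models account for and is intended only to convey the magnitude of k:

\[
C(t) = C_0\,e^{-kt} \;\Longrightarrow\; \frac{C(60\ \mathrm{s})}{C_0} = e^{-0.0441 \times 60} \approx 0.07,
\]

i.e. roughly 93% of the initial ~700 mg l⁻¹ of HEM would be removed after one minute of ideal first-order treatment.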
199

Numerical Solutions of Generalized Burgers' Equations for Some Incompressible Non-Newtonian Fluids

Shu, Yupeng 11 August 2015 (has links)
The author presents some generalized Burgers' equations for incompressible and isothermal flow of viscous non-Newtonian fluids based on the Cross, Carreau, and Power-Law models and some simple assumptions on the flows. The author numerically solves the traveling wave equations for the Cross, Carreau, and Power-Law models using industrial data, and proves existence and uniqueness of solutions to the traveling wave equations of each of the three models. The author also provides numerical estimates of the shock thickness as well as the maximum strain $\varepsilon_{11}$ for each of the fluids.
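For reference, the three constitutive models named in the abstract are usually written as follows for the apparent viscosity \eta as a function of shear rate \dot{\gamma} (standard forms from the rheology literature; the thesis derives its generalized Burgers' equations from these under its own simplifying assumptions):

\[
\text{Power-Law: } \eta = K\,\dot{\gamma}^{\,n-1}, \qquad
\text{Cross: } \eta = \eta_\infty + \frac{\eta_0 - \eta_\infty}{1 + (\lambda\dot{\gamma})^{m}}, \qquad
\text{Carreau: } \eta = \eta_\infty + (\eta_0 - \eta_\infty)\bigl[1 + (\lambda\dot{\gamma})^{2}\bigr]^{\frac{n-1}{2}},
\]

where \eta_0 and \eta_\infty are the zero- and infinite-shear-rate viscosities and K, n, m and \lambda are model parameters. For comparison, the classical Newtonian Burgers' equation that these generalize is u_t + u\,u_x = \nu\,u_{xx}.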
200

Samhällskunskapslärares tankar om samhällsbegreppet i samhällskunskapsundervisningen. : En studie i metoden fokusgruppsintervju av nio yrkesverksamma samhällskunskapslärare på två olika gymnasieskolor. / Social science teachers' thoughts on the concept of society in social studies teaching. : A study using the focus group interview method with nine professional social science teachers at two different upper secondary schools.

Andersson, Jemima January 2019 (has links)
The purpose of this qualitative study is to investigate how social science teachers perceive and express the concept of society in social studies. The study consists of focus group interviews with nine social science teachers at two upper secondary schools, and its results are analyzed against the theoretical backdrop of Odenstad's orientation topics, analytical subjects and discussion topics and Sandahl's first-order and second-order concepts. In short, the two conceptual frameworks describe the skills and abilities that are most important for students to master in order to develop advanced thinking in social science. Particular emphasis is put on critical thinking, that is, the ability to seek, structure and evaluate information from different sources and to draw conclusions from this process. The results show a certain consensus among social science teachers on the concept of society as a potential subject of study and analysis that would simplify and clarify analyses of the different levels of society, which, in turn, would add significance and bring cohesion to the subject as a whole. As for the skills and abilities that stem from Odenstad's orientation topics and Sandahl's first-order concepts, the interviewed teachers all emphasize conceptual ability as well as sound general knowledge of how society is made up. With reference to Odenstad's analytical subjects and discussion topics and Sandahl's second-order concepts, it appears not only important but a prerequisite that students develop analytical ability and critical thinking, as well as the ability to sift through and process large amounts of information and to assume different perspectives on the topic or issue at hand.
