11

Net pay evaluation: a comparison of methods to estimate net pay and net-to-gross ratio using surrogate variables

Bouffin, Nicolas 02 June 2009 (has links)
Net pay (NP) and net-to-gross ratio (NGR) are often crucial quantities for characterizing a reservoir and assessing the amount of hydrocarbons in place. Numerous methods have been developed in the industry to evaluate NP and NGR, depending on the intended purpose. These methods usually involve cut-off values of one or more surrogate variables to discriminate non-reservoir from reservoir rocks. This study investigates statistical issues related to the selection of such cut-off values by considering the specific case of using porosity (φ) as the surrogate. Four methods are applied to permeability-porosity datasets to estimate porosity cut-off values. All the methods assume that a permeability cut-off value has been previously determined, and each method is based on minimizing the prediction error when particular assumptions are satisfied. The results show that delineating NP and evaluating NGR require different porosity cut-off values. In the case where porosity and the logarithm of permeability are jointly normally distributed, NP delineation requires the Y-on-X regression line to estimate the optimal porosity cut-off, while the reduced major axis (RMA) line provides the optimal porosity cut-off value to evaluate NGR. Alternatives to the RMA and regression lines are also investigated, such as discriminant analysis and a data-oriented method using a probabilistic analysis of porosity-permeability crossplots. Joint normal datasets are generated to test the ability of the methods to accurately predict the optimal porosity cut-off value for sampled sub-datasets. The methods are compared to one another on the basis of the bias, standard error and robustness of their estimates. A set of field data from the Travis Peak formation is used to test the performance of the methods.
The conclusions of the study were confirmed on the field data: as long as the initial assumptions concerning the distribution of the data are verified, it is recommended to use the Y-on-X regression line to delineate NP, while either the RMA line or discriminant analysis should be used to evaluate NGR. Where the assumptions on the data distribution are not verified, the quadrant method should be used.
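As an illustration of the distinction this abstract draws, the sketch below computes both a Y-on-X regression line and an RMA line on synthetic joint-normal porosity/log-permeability data and inverts each at a permeability cut-off. All parameter values (means, spreads, correlation, cut-off) are assumed for illustration and are not the thesis's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic joint-normal porosity / log10(permeability) data; the means,
# spreads, correlation, and permeability cut-off are assumed for illustration.
n, rho = 1000, 0.8
mean_phi, mean_logk = 0.15, 1.0
sd_phi, sd_logk = 0.05, 0.7
cov = [[sd_phi**2, rho * sd_phi * sd_logk],
       [rho * sd_phi * sd_logk, sd_logk**2]]
phi, logk = rng.multivariate_normal([mean_phi, mean_logk], cov, n).T

C = np.cov(phi, logk)
r = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])

# Y-on-X regression slope of log10(k) on porosity (for net-pay delineation).
b_yx = C[0, 1] / C[0, 0]
# Reduced major axis (RMA) slope (for net-to-gross evaluation).
b_rma = np.sign(r) * np.sqrt(C[1, 1] / C[0, 0])

# Both lines pass through the centroid; invert each at the permeability
# cut-off (here log10(k) = 0, i.e. k = 1 mD) to get a porosity cut-off.
logk_cut = 0.0
phi_cut_yx = phi.mean() + (logk_cut - logk.mean()) / b_yx
phi_cut_rma = phi.mean() + (logk_cut - logk.mean()) / b_rma

print(f"Y-on-X porosity cut-off: {phi_cut_yx:.3f}")
print(f"RMA    porosity cut-off: {phi_cut_rma:.3f}")
```

Because the Y-on-X slope equals the RMA slope multiplied by the correlation coefficient, the two lines pivot around the data centroid and yield different porosity cut-offs whenever |r| < 1.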
12

A Study of Monitor Chips Applied to Notebook Power Management System

Liao, Ying-Chien 25 October 2004 (has links)
This paper studies the development of firmware for the monitor chips inside the battery pack of a notebook power management system. Through these chips, the firmware monitors safety during battery charge and discharge and estimates the residual capacity of the battery. Because of the battery's chemistry, its residual capacity is affected by the current flow during charge/discharge, the ambient temperature, and battery aging, so the residual capacity varies non-linearly. Therefore, to estimate the residual capacity, curves learned from practical charge/discharge experiments on the battery are used as the basis for finding appropriate correction parameters for the relevant influence factors, which makes the estimate more accurate. At the same time, the battery status can be transmitted to the managing end of the notebook power management system via the system management interface, enabling the system to operate more efficiently.
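A minimal sketch of this kind of correction-based estimate: coulomb counting scaled by temperature and load-current factors interpolated from experimentally learned tables. All table values and the pack rating below are hypothetical placeholders, not the paper's firmware parameters.

```python
import numpy as np

# Hypothetical correction tables learned from charge/discharge experiments:
# effective-capacity factor vs. temperature (°C) and vs. discharge current (A).
temp_pts  = np.array([0.0, 10.0, 25.0, 40.0])
temp_fact = np.array([0.80, 0.90, 1.00, 0.97])
curr_pts  = np.array([0.5, 1.0, 2.0, 3.0])
curr_fact = np.array([1.00, 0.97, 0.92, 0.85])

NOMINAL_CAPACITY_AH = 4.4   # assumed pack rating

def residual_capacity(drawn_ah, temp_c, current_a):
    """Estimate remaining capacity (Ah) by coulomb counting, scaled by
    empirically derived temperature and current correction factors."""
    f_t = np.interp(temp_c, temp_pts, temp_fact)
    f_i = np.interp(current_a, curr_pts, curr_fact)
    effective = NOMINAL_CAPACITY_AH * f_t * f_i
    return max(effective - drawn_ah, 0.0)

# After drawing 2.0 Ah at 25 °C under a 1 A load:
print(round(residual_capacity(2.0, 25.0, 1.0), 3))
```

In real firmware the tables would be refined against the measured discharge curves, and the state of charge would be reported over the system management interface.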
13

Equipping believers to explain in a clear and culturally appropriate way the New Testament meaning of saving grace to a Roman Catholic Cajun in a short conversation

Jemison, William Dearing. January 2000 (has links)
Project Thesis (D. Min.)--Denver Seminary, 2000. / Includes bibliographical references (leaves 394-408).
14

Development of a lower intestine targeting mucoadhesive platform of oral drug delivery

Jang, Shih-Fan 02 July 2013 (has links)
Our goal was to develop a mucoadhesive, oral vaccination delivery platform designed to target Peyer's patches in the ileum. To achieve this, we prepared poly(methyl methacrylate) (PMMA) particles of various sizes using W/O/W emulsification solvent evaporation and surface polymerization methods. We then incorporated mucoadhesive polymer coatings into the carrier system to enhance residence time at the targeted site. We also developed our own in vitro mucoadhesion testing ramp as an evaluation tool. Finally, nano- and micro-structured particles were manufactured as two different oral vaccine delivery systems (Solid Lipid Nanoparticles, SLNs; and Protein Coated Microcrystals, PCMC). After the model antigen, bovine serum albumin (BSA), was loaded into the SLNs or PCMC, mucoadhesive polymers were incorporated and the mixture was formulated into pellets. The pellets were then layered with an enteric coating composed of a mixture of Eudragit® FS 30 D/Eudragit® L 30 D-55 for ileum-targeted delivery. The in vitro mucoadhesion test ramp was capable of investigating the mucoadhesive properties of tablets and pellets, providing a rank order for study. Most importantly, it was anticipated that this might reduce the burden of animal testing in future mucoadhesive studies. Microcapsules/beads of specific sizes were manufactured reproducibly by solvent evaporation and surface polymerization. Although we could not specify the cut-off size at the pyloric sphincter in mice, we concluded that the cut-off size at the pyloric sphincter in rats was approximately 2.5-3 mm, which was supported by both the biodistribution data and the direct images from scintigraphy scanning. Moreover, we found that particle size significantly alters the gastric emptying time in both rodent models.
The small microcapsules/beads (50-100 μm for mice and 0.5-1 mm for rats) were hindered in the folds of the stomach and emptied the slowest, followed by the large particles, then the medium particles. Finally, the PCMC and SLNs we manufactured were suitable carriers for protein APIs such as BSA. These particles were of a fitting size for M-cell uptake, which could induce mucosal immune responses. Therefore, antigen-containing PCMC and SLNs might be suitable platforms for oral vaccination.
15

The Effect of the Cut Off Rules of the Batería Woodcock-Muñoz: Pruebas de Habilidad Cognitiva-Revisada on the Identification and Placement of Monolingual and Bilingual Spanish-Speaking Students in Special Education: A Cross-cultural Study

Chacon, Vanessa January 2007 (has links)
This study was designed to investigate whether the Batería Woodcock-Muñoz: Pruebas de Habilidad Cognitiva-Revisada is a valid cross-cultural tool to measure the cognitive ability of students from three Spanish-speaking groups in two different Spanish-speaking countries. One group is represented by culturally diverse bilingual, Spanish-dominant students in Tucson, Arizona, since there is an overrepresentation of bilingual students receiving special education services in all school districts in this area. The second group consists of monolingual Spanish speakers from Costa Rica referred for special education. The third group constitutes monolingual Spanish speakers from Costa Rica performing at grade level. This research analyzed whether Memory for Sentences, a sub-test of Short Term Memory, and the Visual Integration and Picture Recognition sub-tests of Visual Processing in the psycho-educational Batería Woodcock-Muñoz are more difficult for the special education Spanish/bilingual population in Tucson than for the monolingual Spanish-speaking special education and grade-level individuals in Costa Rica. Item p-value differences were estimated and compared for all items in each subtest to detect whether a major difference in item difficulty order existed between the Spanish-speaking groups that could be indicative of internal criteria of test bias. Results show that the item order of difficulty affects the test's established cut-off rules for both Costa Rican populations on the Memory for Sentences test, making it invalid for these populations, and that the Tucson sample group's performance is lower than that of both Costa Rican groups.
In addition, both Visual Processing subtests are invalid for all groups compared, since the item order of difficulty does not match the test item order, thus affecting the enforcement of the cut-off rules. Standardized assessments and the intelligence construct can be seen as the results of mathematical and statistical expressions built on test developers' own cultural views, following the traditional reductionist assessment or scientific/medical models. As a result, it is concluded that bilingual populations will be at a disadvantage because standardized assessment neither links assessment to familiar language, culturally relevant information, and experiences, nor considers how the bilingual mind processes information.
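The item p-value comparison described above can be sketched as follows. The response data, group sizes, and item difficulties are simulated for illustration only; the study's actual subtests and samples are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated 0/1 item responses for two groups on a 10-item subtest.
# True item difficulties are assumed values for illustration.
n_items = 10
p_true_a = np.linspace(0.9, 0.3, n_items)   # group A success rates
p_true_b = rng.permutation(p_true_a)        # reordered for group B

resp_a = rng.random((200, n_items)) < p_true_a
resp_b = rng.random((200, n_items)) < p_true_b

# Item p-values = proportion correct per item (classical item difficulty).
pval_a = resp_a.mean(axis=0)
pval_b = resp_b.mean(axis=0)

# Compare difficulty orderings with a Spearman rank correlation; a low
# correlation suggests the item difficulty order differs between groups,
# which can indicate internal test bias.
def spearman(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

print(f"rank correlation of item difficulty: {spearman(pval_a, pval_b):.2f}")
```

When the difficulty ordering differs, fixed cut-off (basal/ceiling) rules calibrated on the norming sample stop early on items a given group actually finds easy, which is the mechanism of invalidity the study reports.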
18

Algorithmic multiparameterised verification of safety properties: process algebraic approach

Siirtola, A. (Antti) 28 September 2010 (has links)
Due to the increasing amount of concurrency, systems have become difficult to design and analyse. In this effort, formal verification, which means proving the correctness of a system, has turned out to be useful. Unfortunately, the application domain of formal verification methods is often indefinite, tools are typically unavailable, and most of the techniques do not suit the verification of software systems especially well. These are the questions addressed in this thesis. A typical approach to modelling systems and specifications is to consider them parameterised by the restrictions of the execution environment, which results in an (infinite) family of finite-state verification tasks. The thesis introduces a novel approach to the verification of such infinite specification-system families represented as labelled transition systems (LTSs). The key idea is to exploit the algebraic properties of the correctness relation. They allow the correctness of large system instances to be derived from that of smaller ones and, in the best case, an infinite family of finite-state verification tasks to be reduced to a finite one, which can then be solved using existing tools. The main contribution of the thesis is an algorithm that automates the reduction method. A specification and a system are given as parameterised LTSs, and the allowed parameter values are encoded using first-order logic. Parameters are sets and relations over these sets, which are typically used to denote, respectively, the identities of replicated components and the relationships between them. Because the number of parameters is not limited and they can be nested, one can express multiply parameterised systems with a parameterised substructure, which is an essential property from the viewpoint of modelling software systems. The algorithm terminates on all inputs, so its application domain is explicit in this sense.
Other proposed parameterised verification methods do not have both these features. Moreover, some of the earlier results on the verification of parameterised systems are obtained as a special case of the results presented here. Finally, several natural and significant extensions to the formalism are considered, and it is shown that the problem becomes undecidable in each of the cases. Therefore, the algorithm cannot be significantly extended in any direction without simultaneously restricting some other aspect.
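The finite-state check that such a reduction ultimately hands to existing tools — is one finite LTS correct with respect to another? — can be illustrated with a small simulation check computed as a greatest fixpoint. The lock example and the use of the simulation preorder here are illustrative assumptions, not the thesis's exact formalism or correctness relation.

```python
from itertools import product

# A finite LTS as: an initial state plus transitions
# {state: [(action, next_state), ...]}.

def simulates(spec_init, spec_trans, sys_init, sys_trans):
    """Return True if the specification simulates the system: for every action
    the system can take from a related pair, the spec can match it."""
    states_sys = set(sys_trans) | {d for ts in sys_trans.values() for _, d in ts}
    states_spec = set(spec_trans) | {d for ts in spec_trans.values() for _, d in ts}
    # Start from the full relation and iteratively remove failing pairs.
    rel = set(product(states_sys, states_spec))
    changed = True
    while changed:
        changed = False
        for (s, t) in list(rel):
            for a, s2 in sys_trans.get(s, []):
                if not any((s2, t2) in rel
                           for b, t2 in spec_trans.get(t, []) if b == a):
                    rel.discard((s, t))
                    changed = True
                    break
    return (sys_init, spec_init) in rel

# Spec: a lock may be acquired and then released, repeatedly.
spec = {"u": [("acq", "l")], "l": [("rel", "u")]}
# System instance that uses the lock correctly.
sys_ok = {"0": [("acq", "1")], "1": [("rel", "0")]}
# Faulty instance that releases without acquiring.
sys_bad = {"0": [("rel", "0")]}

print(simulates("u", spec, "0", sys_ok))    # True
print(simulates("u", spec, "0", sys_bad))   # False
```

The thesis's contribution sits above this level: deciding which finitely many instances of a parameterised family need to pass such a check so that correctness of all (infinitely many) instances follows.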
19

Použití metody kvantifikace DNA jako screeningového nástroje pro efektivní genotypování vzorků ve forenzní DNA laboratoři. / Use of DNA quantification as a screening tool for efficient genotyping of samples in a forensic DNA laboratory.

Koljenšič, Ivana January 2011 (has links)
Quantification of human DNA in forensic samples is an important step in STR profiling because STR genotyping is sensitive to the quantity of DNA used in the PCR reaction. This study focuses on the importance of quantification in the entire process of genetic analysis. Two real-time PCR platforms (Roche LightCycler480 System and ABI 7900 RT PCR) were used to compare two commercial kits in terms of DNA quantification. It was found that the accuracy of absolute quantification values obtained with commercial quantification kits is strongly dependent on the construction of the calibration curve. Low-template DNA samples in particular were used to assess whether the Quantifiler™ or Plexor® HY System can determine a minimum quantification value (cut-off value) below which STR profiles would consistently fail to be detected. The Plexor® HY System made it possible to determine the cut-off quantification value more precisely, probably due to the different molecular background and chemistry used in this kit. Reliability and other issues connected with the cut-off value are discussed. In order to better understand the relationship between the quantity of DNA and the number of detectable loci, a series of dilution experiments with the DNA007 standard was performed. Quantitative and qualitative consequences of input DNA amount in evaluation of...
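The dependence of absolute quantification on the calibration curve can be illustrated with a standard-curve sketch: Ct values are fitted linearly against log10 of known standard concentrations, and sample concentrations are read off by inverting the fit. The concentrations and Ct values below are invented for illustration and are not from either kit in the study.

```python
import numpy as np

# Calibration standards: known DNA concentrations (ng/µL) and measured
# Ct values (illustrative values only).
conc = np.array([50.0, 5.0, 0.5, 0.05, 0.005])
ct   = np.array([22.1, 25.5, 28.9, 32.4, 35.8])

# Standard curve: Ct is (ideally) linear in log10(concentration).
slope, intercept = np.polyfit(np.log10(conc), ct, 1)

# Amplification efficiency implied by the slope
# (100% efficiency corresponds to a slope of about -3.32).
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(sample_ct):
    """Invert the standard curve to estimate sample concentration (ng/µL)."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {efficiency:.1%}")
print(f"sample at Ct 30.0 -> {quantify(30.0):.3f} ng/µL")
```

Because every absolute value is read off this fitted line, any error in the standards or in the fit propagates directly into the reported quantities, which is why low-template cut-off values differ between kits with different calibration chemistry.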
20

The Development of a Reliable Change Index and Cutoff for the SCORE-15

Nebeker Adams, Cara Ann 01 December 2018 (has links)
The Systemic Clinical Outcome and Routine Evaluation version 15 (SCORE-15) is a measure used to assess clinical change in family functioning. The SCORE-15 has previously been demonstrated to be a reliable and valid measure of clinical change and is widely used throughout the UK. However, the SCORE-15 lacks the ability to determine whether an individual's change in family functioning is clinically significant. This study aims to establish a reliable change index and clinical cutoff score based on a US sample so that researchers and clinicians can determine clinically significant change. A sample of 63 clinical participants and 244 community participants completed the SCORE-15, including 165 community participants who completed it a second time. Results established a cutoff of 51.92 and a reliable change index of 17.51 for the SCORE-15. This indicates that therapy clients who improve their SCORE-15 score by at least 17.5 points and who cross the threshold of 52 during the course of therapy are considered to have experienced clinically significant improvement.
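A reliable change index and clinical cutoff of this kind are typically computed with the Jacobson-Truax formulas, sketched below. The sample statistics used here are assumed placeholders, not the SCORE-15 values reported in the abstract.

```python
import math

def reliable_change_index(sd_pre, reliability, z=1.96):
    """Smallest score change exceeding measurement error at the given z level:
    RCI threshold = z * SE_diff, with SE_diff = sqrt(2) * SD * sqrt(1 - r)."""
    se_measure = sd_pre * math.sqrt(1.0 - reliability)
    se_diff = math.sqrt(2.0) * se_measure
    return z * se_diff

def cutoff_c(mean_clin, sd_clin, mean_comm, sd_comm):
    """Jacobson-Truax cutoff 'c': the weighted midpoint between the clinical
    and community (functional) distributions."""
    return (sd_comm * mean_clin + sd_clin * mean_comm) / (sd_clin + sd_comm)

# Hypothetical sample statistics (not this study's data):
rci = reliable_change_index(sd_pre=12.0, reliability=0.85)
c = cutoff_c(mean_clin=60.0, sd_clin=12.0, mean_comm=45.0, sd_comm=10.0)
print(f"RCI = {rci:.2f}, cutoff c = {c:.2f}")
```

A client whose score change exceeds the RCI and who crosses the cutoff in the functional direction is classified as showing clinically significant change, which mirrors how the 17.51 and 51.92 values in the abstract are applied.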
