21

Chemical oxidation of tryptic digests to improve sequence coverage in peptide mass fingerprint protein identification

Lucas, Jessica Elaine 30 September 2004 (has links)
Peptide mass fingerprinting (PMF) of protein digests is a widely accepted method for protein identification in MS-based proteomic studies, and matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is the technique of choice in PMF experiments. The success of protein identification in a PMF experiment is directly related to the amount of amino acid sequence coverage. In an effort to increase the sequence information obtained in a MALDI PMF experiment, performic acid oxidation was performed on tryptic digests of known proteins. Performic acid was chosen as the chemical oxidant because of its ease of use and its selective oxidation of cysteine, methionine, and tryptophan residues. In experiments performed in our laboratory, performic acid oxidation either increased or did not affect protein sequence coverage in PMF experiments when oxidized tryptic digests were analyzed by MALDI. Negative mode MALDI data were acquired in addition to positive mode data because cysteic acid-containing peptides ionize more readily in negative mode. Furthermore, because cysteine, methionine, and tryptophan occur at low abundance in the majority of known and hypothetical proteins, observing mass shifts indicative of these residues when comparing MALDI spectra acquired before and after performic acid oxidation increases confidence in a protein match.
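The characteristic mass shifts make the oxidized forms straightforward to predict. As a rough sketch, assuming the commonly cited monoisotopic shifts for performic acid oxidation (these values are illustrative assumptions, not figures from the thesis), the expected mass of a fully oxidized tryptic peptide could be computed like this:

```python
# Illustrative sketch: predict the mass shift expected after performic
# acid oxidation of a tryptic peptide. The residue-level shifts are the
# commonly cited monoisotopic values (each added oxygen = 15.9949 Da);
# they are assumptions for illustration, not figures from the thesis.

MONO_O = 15.9949  # monoisotopic mass of oxygen, Da

# Net mass added per oxidized residue under performic acid oxidation
OXIDATION_SHIFTS = {
    "C": 3 * MONO_O,  # cysteine -> cysteic acid         (+47.985 Da)
    "M": 2 * MONO_O,  # methionine -> methionine sulfone (+31.990 Da)
    "W": 2 * MONO_O,  # tryptophan -> e.g. N-formylkynurenine (one of
                      # several possible Trp oxidation products)
}

def oxidized_mass(peptide: str, unmodified_mass: float) -> float:
    """Return the expected monoisotopic mass of a fully oxidized peptide."""
    shift = sum(OXIDATION_SHIFTS.get(aa, 0.0) for aa in peptide)
    return unmodified_mass + shift

# Hypothetical tryptic peptide with one Cys and one Met:
print(oxidized_mass("ACDMKR", 720.33))  # 720.33 + 47.985 + 31.990 = 800.30
```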
22

The rules on interest deduction limitations: are the ten percent rule and the valve in need of tightening?

Lundqvist, Amanda January 2012 (has links)
On 1 January 2009, rules limiting the right of companies to deduct interest were introduced, with the aim of preventing aggressive tax planning carried out through interest arrangements. Two exceptions to this limitation are known as the ten percent rule and the valve. Under these exceptions, interest may be deducted if the recipient of the interest income is taxed at a rate of at least ten percent, or if the arrangement is commercially justified. Both exceptions have, however, been criticized: aggressive tax planning through interest arrangements has not declined despite the introduction of the interest deduction limitation rules, and it is unclear whether the ten percent rule is compatible with the freedom of establishment within the EU. For this reason, the Government has submitted proposals to tighten the rules on interest deduction limitations. The purpose of this thesis is therefore to examine whether the ten percent rule and the valve should be tightened and, in light of the Government's proposals, how such a tightening should be designed. The thesis applies a traditional legal method to describe the law as it stands and aims to be jurisprudential in answering the questions posed. A tightening of the ten percent rule and the valve is necessary, because the corporate tax base today needs a protection that neither rule in its present form can provide. None of the Government's proposed changes to the ten percent rule should be introduced, however, as they must be regarded as too far-reaching. Regarding the valve, two proposals are sufficiently well balanced, and strengthen its design enough, that they should be introduced.
23

Examination of Initialization Techniques for Nonnegative Matrix Factorization

Frederic, John 21 November 2008 (has links)
While much research has been done regarding different Nonnegative Matrix Factorization (NMF) algorithms, less attention has been paid to initialization techniques. In this thesis, four different initializations are considered. After a brief discussion of NMF, the four initializations are described and each one is independently examined, followed by a comparison of the techniques. Next, each initialization's performance is investigated with respect to changes in the size of the data set. Finally, a method by which smaller data sets may be used to determine how to treat larger data sets is examined.
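The abstract does not name the four initializations studied; as an illustrative sketch of the comparison methodology, the snippet below benchmarks the initialization schemes built into scikit-learn (an assumed stand-in, not the thesis's code) by their final reconstruction error:

```python
# Minimal sketch: compare how the choice of NMF initialization affects
# reconstruction error on a fixed data matrix. The schemes below are
# those available in scikit-learn and serve only to illustrate the
# comparison methodology described in the abstract.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((100, 50))  # synthetic nonnegative data matrix

for init in ("random", "nndsvd", "nndsvda", "nndsvdar"):
    model = NMF(n_components=10, init=init, max_iter=500, random_state=0)
    W = model.fit_transform(X)
    H = model.components_
    err = np.linalg.norm(X - W @ H, "fro")  # Frobenius reconstruction error
    print(f"{init:>8s}: reconstruction error = {err:.4f}")
```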
24

ROBUST STATISTICAL METHODS FOR NON-NORMAL QUALITY ASSURANCE DATA ANALYSIS IN TRANSPORTATION PROJECTS

Uddin, Mohammad Moin 01 January 2011 (has links)
The American Association of State Highway and Transportation Officials (AASHTO) and the Federal Highway Administration (FHWA) require the use of statistically based quality assurance (QA) specifications for construction materials. As a result, many state highway agencies (SHAs) have implemented QA specifications for highway construction. In these statistically based QA specifications, the quality characteristics of most construction materials are assumed to be normally distributed; however, the normality assumption can be violated in several ways: the data distribution can be skewed, kurtosis-induced, or bimodal. If the process shows evidence of a significant departure from normality, the quality measures calculated may be erroneous. In this research study, an extended QA data analysis model is proposed which significantly improves the Type I error and power of the F-test and t-test, and removes bias from Percent Within Limits (PWL)-based pay factor calculations. For the F-test, three alternative tests are proposed when the sampling distribution is non-normal: 1) Levene's test; 2) Brown and Forsythe's test; and 3) O'Brien's test. One alternative is proposed for the t-test: the non-parametric Wilcoxon-Mann-Whitney rank-sum test. For PWL-based pay factor calculation when lot data suffer from non-normality, three schemes were investigated: 1) simple transformation methods; 2) the Clements method; and 3) a modified Box-Cox transformation using the golden section search method. A Monte Carlo simulation study revealed that both Levene's test and Brown and Forsythe's test are robust alternative tests of variances when the underlying sample population distribution is non-normal. Between the t-test and the Wilcoxon test, the t-test was found to be significantly robust even when the sample population distribution was severely non-normal. Among the data transformations for PWL-based pay factors, the modified Box-Cox transformation using the golden section search method was the most effective in minimizing or removing pay bias. Field QA data were analyzed to validate the model, and a Microsoft Excel macro-based tool was developed that can adjust pay consequences due to non-normality.
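Two of the alternative tests named above have ready-made SciPy implementations (using SciPy here is an assumption for illustration; the abstract does not say how the thesis implemented them): scipy.stats.levene with center="mean" is Levene's original test, and center="median" gives the Brown-Forsythe variant. A minimal sketch:

```python
# Sketch: robust alternatives to the classical F-test and t-test on
# skewed (non-normal) QA data, using SciPy. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
contractor = rng.lognormal(mean=0.0, sigma=0.4, size=30)  # skewed QA data
agency = rng.lognormal(mean=0.1, sigma=0.5, size=15)

print(stats.levene(contractor, agency, center="mean"))    # Levene's test
print(stats.levene(contractor, agency, center="median"))  # Brown-Forsythe

# Non-parametric alternative to the two-sample t-test:
print(stats.mannwhitneyu(contractor, agency, alternative="two-sided"))
```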
25

EVALUATION OF THE BODY COMPOSITION OF FEMALE COLLEGIATE ATHLETES USING THE BOD POD

Glodt Baker, Adrienne Jennifer 01 January 2012 (has links)
The body composition of female collegiate athletes was measured using the Bod Pod® device. The sample consisted of 75 student athletes, aged 18 to 22 years. Five sports at the university level were represented: basketball, gymnastics, soccer, swimming & diving, and volleyball. Participants were measured at the preseason and postseason periods. Overall, participants in all five sports showed no significant change in total body mass, fat mass, fat-free mass, percent body fat, or body mass index from the preseason period to the postseason period at the alpha = 0.05 level. On average, the members of each of the different teams were found to differ significantly from each other on one or more variables. In general, basketball and volleyball players were found to be similar in body composition. The average member of the swimming & diving, soccer, and gymnastics teams was found to differ from the average member of each of the other teams.
26

Accuracy in Body Composition Assessment with Three Different Methods Compared to DEXA

Duz, Serkan 01 January 2003 (has links) (PDF)
The purpose of this study was to investigate differences among the percent body fat (%BF) values of Turkish sedentary male and female university students measured by dual-energy x-ray absorptiometry (DEXA), skinfold (SKF), ultrasound (US), and hand-to-hand bioelectrical impedance analysis (BIA). Two hundred eight Turkish university students (one hundred four males and one hundred four females), aged 18 to 26 years, participated in this study voluntarily. %BF assessment was performed by the SKF, US, BIA, and DEXA methods. Differences among DEXA, SKF, US, and BIA were examined by applying a series of paired t-tests. Multiple regression analyses were conducted to develop regression equations to predict %BF from SKF and US measurements. Results demonstrated that there were significant differences between DEXA and the SKF, US, and BIA measurements for males and females. The mean %BF derived from DEXA was significantly (p < .001) greater than those of SKF, US, and BIA for males and females. Multiple regression analyses showed that SKF and US measurement of subcutaneous fat at three sites gave the best prediction of %BF for males and females separately. The multiple correlations using three sites simultaneously for men and women were r = 0.92, SEE = 2.4 and r = 0.91, SEE = 2.8 for SKF, and r = 0.93, SEE = 2.3 and r = 0.90, SEE = 3.0 for US, respectively. In summary, with the new regression equations, US appears to be a reliable, portable, and non-invasive tool which can be used by any field investigator on obese or thin individuals. Finally, the new regression equations developed do not seem to be superior to those reported using calipers.
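For readers unfamiliar with the reported statistics, the sketch below shows how a three-site prediction equation, its multiple correlation r, and its standard error of estimate (SEE) might be computed; the data are hypothetical placeholders, not the study's measurements:

```python
# Sketch of the regression step described above: fit a three-site model
# predicting %BF from a reference method, then report the multiple
# correlation r and the standard error of estimate (SEE).
import numpy as np

rng = np.random.default_rng(2)
n = 104
# three hypothetical measurement sites (mm) and a reference %BF (DEXA)
sites = rng.normal(loc=[10, 15, 20], scale=4, size=(n, 3))
bf_dexa = 5 + sites @ np.array([0.4, 0.3, 0.5]) + rng.normal(0, 2.5, n)

X = np.column_stack([np.ones(n), sites])        # add intercept column
beta, *_ = np.linalg.lstsq(X, bf_dexa, rcond=None)
pred = X @ beta

resid = bf_dexa - pred
p = 3                                            # number of predictors
see = np.sqrt(resid @ resid / (n - p - 1))       # standard error of estimate
r = np.corrcoef(pred, bf_dexa)[0, 1]             # multiple correlation r
print(f"r = {r:.2f}, SEE = {see:.1f} %BF")
```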
27

A Framework for Monitorable Services Implementation

CARDOSO, David Menezes 16 February 2012 (has links)
Since the very first graphical user interfaces, progress indicators have been widely used to provide feedback on the execution of long-running system tasks. Practical experience and formal experiments suggest that such indicators are an important user interface tool, enhancing the attractiveness and effectiveness of the programs that incorporate them. However, to make progress feedback possible, the system services involved must provide on-line monitoring capabilities. As software systems become increasingly large and complex, often involving intricate interactions among many components and abstraction layers, the crosscutting nature of monitoring concerns introduces several challenges to software development: (1) code quality degradation through tangling and scattering; (2) costly software evolution and maintenance; (3) absence of specific development patterns and standardized process guidance; (4) loss of development productivity; and (5) inconsistent monitoring results. In this context, this work provides an analysis of monitoring requirements and possible implementation approaches, along with their main benefits and weaknesses. Furthermore, a solution is proposed and evaluated that aids software development by mitigating or completely eliminating these monitoring-related challenges. The solution consists of a framework, extended libraries, and generic software process guidelines for monitoring requirements, focused on the Rational Unified Process (RUP) for purposes of exemplification but not limited to it.
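The framework's own API is not shown in the abstract; below is a minimal conceptual sketch (hypothetical names throughout) of what makes a service monitorable: a long-running task publishes progress to registered listeners, so UI progress indicators stay decoupled from the business logic.

```python
# Conceptual sketch only: a long-running task reports progress through a
# listener interface, keeping the monitoring concern out of the business
# logic. This illustrates the idea, not the thesis's framework or API.
from typing import Callable

ProgressListener = Callable[[int, int], None]  # (steps done, total steps)

class MonitorableTask:
    """A long-running task that exposes progress to registered listeners."""

    def __init__(self, total_steps: int):
        self.total = total_steps
        self._listeners: list[ProgressListener] = []

    def add_listener(self, listener: ProgressListener) -> None:
        self._listeners.append(listener)

    def _report(self, done: int) -> None:
        for notify in self._listeners:
            notify(done, self.total)

    def run(self) -> None:
        for step in range(1, self.total + 1):
            # ... business logic for one unit of work goes here ...
            self._report(step)

# A console progress indicator subscribes without touching the task code:
task = MonitorableTask(total_steps=4)
task.add_listener(lambda done, total: print(f"{100 * done // total}% complete"))
task.run()
```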
28

Effect of Exergaming on Physical Activity of Adults with Intellectual Disabilities

Vergara, Jennifer Dawn 03 November 2017 (has links)
Physical inactivity is the fourth leading risk factor for global mortality (World Health Organization, 2016a). Thus, the World Health Organization (2016a) recommends engaging in at least 150 minutes of physical activity (PA) throughout the week. Many individuals with intellectual disabilities (ID) lead sedentary lifestyles that raise concern about their long-term health. The purpose of this study was to evaluate the effects of exergaming on PA and its intensity when implemented with adults with ID. Four adult males diagnosed with ID were recruited. During the scheduled phase, the percent occurrence of PA was variable across both conditions for each participant. During the choice phase, all participants chose the exergaming condition. All ratings of intensity were attainable across both conditions for all participants. Results varied across participants. Participants reported high acceptability for exergaming.
29

Diagnostic Accuracy of Nonword Repetition Tasks for the Clinical Assessment of Spanish-English Dual Language Learners: A Preliminary Investigation

Czirr, Audrey 14 June 2022 (has links)
Nonword repetition (NWR) has demonstrated significant potential as a less-biased language assessment measure for dual language learners (DLLs). However, there are currently no available guidelines for the use of NWR in a clinical setting. The purpose of this preliminary study is to develop initial recommendations for the clinical use of NWR tasks by determining the diagnostic accuracy and optimal cut-off scores for two NWR tasks and scoring methods, and to evaluate the clinical feasibility of NWR as an assessment measure. Participants included 23 DLL students with and without language disorder between the ages of 6 and 8. Spanish and English NWR tasks were administered in school classrooms and scored by percent phonemes correct (PPC) and number of whole words correct. Optimal cut-off scores resulting in the best sensitivity and specificity were calculated for each task and scoring method. Diagnostic accuracy was then compared for each task, combination of tasks, and scoring method. English PPC, Spanish PPC, and combined whole word scores yielded acceptable levels of sensitivity and specificity. Combined PPC scores resulted in excellent specificity, but inadequate sensitivity. Whole word scores for the tasks individually did not approach acceptable diagnostic accuracy. The current findings suggest that NWR can be feasibly implemented in the clinical setting and yield accurate results. English-Spanish whole word scores show potential as an accurate assessment measure for DLL children but should be investigated further. English-Spanish PPC scores appear to be appropriate for ruling out a language disorder, but are insufficient for ruling one in. These results provide preliminary support for the use of NWR tasks in the clinical assessment of DLLs as well as initial recommendations for their administration and interpretation.
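The cut-off analysis can be illustrated with a short sketch (synthetic data; scikit-learn is an assumed tool, as the study's actual analysis software is not named): pick the threshold that maximizes Youden's J, then read off sensitivity and specificity.

```python
# Sketch of the cut-off analysis described above: choose the score
# threshold that best separates children with and without language
# disorder, using Youden's J (sensitivity + specificity - 1).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
# 1 = language disorder, 0 = typically developing (hypothetical labels)
labels = np.array([1] * 8 + [0] * 15)
# percent-phonemes-correct scores; lower scores suggest disorder
ppc = np.concatenate([rng.normal(70, 8, 8), rng.normal(88, 6, 15)])

# roc_curve expects higher score = more likely positive, so negate PPC
fpr, tpr, thresholds = roc_curve(labels, -ppc)
j = tpr - fpr                       # Youden's J at each candidate cut-off
best = np.argmax(j)
print(f"optimal cut-off: PPC <= {-thresholds[best]:.1f}")
print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```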
30

The Use of Nonword Repetition Tasks in the Assessment of Developmental Language Disorder in Bilingual Children

Kelly, Kirsten 17 June 2021 (has links)
To address the needs of the growing number of Spanish-English bilingual children in the United States, nonword repetition (NWR) tasks were created to reduce testing bias in the assessment and diagnosis of children with developmental language disorder (DLD). Several studies have shown promising results in the use of NWR tasks; however, fewer studies have addressed questions such as the use of different scoring methods or the analysis of error patterns. Thus, this study was conducted to address these gaps in the research. An English and a Spanish NWR task were administered to 26 Spanish-English bilingual school-aged children (ages 6;0-9;4). Two different scoring methods (percent phonemes correct and whole word scoring) were compared for diagnostic accuracy, and the types and frequency of errors were analyzed. Both scoring methods showed statistically significant differences between groups (participants with DLD and those with typically developing language). Whole word scoring in Spanish had the best diagnostic accuracy according to sensitivity, specificity, and likelihood ratio measures. However, given the small number of nonwords that any participant repeated correctly, this may not be a clinically practical scoring method. The Spanish NWR task was a better measure than the English NWR task in identifying children with DLD, suggesting that Spanish NWR could be used to assess DLD in bilingual children. Participants with DLD produced more consonant, vowel, substitution, and omission errors than those with typically developing language. There was no difference between groups for addition errors. Significantly more omission errors were made in Spanish, likely due to the longer nonwords; the longer nonwords may be key in distinguishing between typically developing children and those with DLD. These results have the potential to inform future clinical practices in selecting, scoring, and analyzing NWR tasks.
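The two scoring methods differ in how they award partial credit; here is a toy sketch (hypothetical phoneme strings, with difflib alignment as a deliberate simplification of phoneme-level scoring):

```python
# Toy contrast of the two scoring methods compared above: percent
# phonemes correct (PPC) gives partial credit per phoneme, whole-word
# scoring is all-or-nothing. The alignment via difflib is an
# illustrative simplification, not the study's scoring procedure.
from difflib import SequenceMatcher

def ppc(target: list[str], produced: list[str]) -> float:
    """Percent of target phonemes produced correctly (order-sensitive)."""
    matcher = SequenceMatcher(a=target, b=produced)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return 100 * matched / len(target)

def whole_word(target: list[str], produced: list[str]) -> int:
    """1 if the whole nonword is repeated exactly, else 0."""
    return int(target == produced)

target = ["b", "a", "f", "o", "t", "u"]     # hypothetical nonword
produced = ["b", "a", "f", "o", "k", "u"]   # one substitution error

print(ppc(target, produced))         # ~83.3 -> partial credit
print(whole_word(target, produced))  # 0     -> all-or-nothing
```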
