  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Implementação de uma análise computadorizada da curva de emissão termoluminescente e aplicação em dosimetria clínica / Implementation of a computerized glow curve analysis and application in clinical dosimetry

Marcela Felix Chaves Ferreira 28 February 2018 (has links)
In the 1960s and 1970s, the first investigations of thermoluminescent dosimeters (TLDs), specifically their dosimetric peaks, quickly revealed a surprising number of phenomena that could be directly related to ionization density. Somewhat later, in the 1980s and early 1990s, seemingly unrelated radiation-induced phenomena were discovered in other systems based on lithium fluoride (LiF). The last decade, however, has witnessed the emergence of several models, driven by a deeper understanding of the underlying TL mechanisms and by microdosimetric modeling developed specifically to explain ionization-density phenomena. Many applications in radiotherapy deliver radiation doses above 1 Gy, whereas diagnostic radiology doses are in the range of a few mGy, and very high levels of precision are required to achieve optimal treatment. This demands careful attention to highly detailed measurement protocols, as well as time-consuming calibration of all TLDs to correct for the non-linearity of the dose response. These properties can vary from batch to batch and may also depend on radiation exposure, heating, and handling history. Thus, even with excellent advances in TLD studies regarding thermal treatments and methods of glow curve analysis, further work is needed to enable better clinical use of the technique. A computerized glow curve analysis (CGCA) was implemented using data exported from the WinREMS software. TL dosimeters absorb and store energy from ionizing radiation and re-emit it as photons in the visible-ultraviolet region; the emitted light is detected by a photomultiplier tube and correlated with the absorbed dose received by the material.
The glow peaks were fitted with a MATLAB algorithm adopting a first-order kinetics model. The material tested was Harshaw LiF:Mg,Ti (lithium fluoride doped with magnesium and titanium), and the quality of each fit was quantified by a figure of merit (FOM). The lowest FOM obtained for the dosimeter group was 1.04% and the highest was 9.79%. The minimum detectable dose was also evaluated, using the best-performing parameter according to the homogeneity of the dosimeter group; the mean minimum detectable dose was 28 µGy. The reproducibility, expressed as the device variability index (DVI), was 14.01%, which can be explained by the large number of dosimeters in the batch. With the reduced dosimeter preparation time and the computerized glow curve analysis, clinical use of TLDs becomes more viable, since dosimeter sensitivity was not affected. Although the reproducibility variation was higher than expected, an individual correction factor for each dosimeter is recommended, along with discarding those whose values deviate most from the batch. / Master of Science
22

Tongue Twisters Quantified: Ultrasound Analysis of Speech Stability and Speech Errors

Reddick, Karen 30 June 2016 (has links)
This thesis investigates errors on speech sounds (or phonemes) produced in laboratory speech stimuli designed to generate phonological onset errors. The present study adds to the literature on phonological speech errors with an instrumental analysis of tongue posture during speech error production and an investigation of the nature of speech errors as unintended variation in articulation. This study utilized ultrasound instrumentation to visualize speech errors made on velar and alveolar stop consonants at the point of stop closure. Two types of errors were of interest, categorical errors and gradient errors. Categorical errors are those that are heard by the listener and instrumentally appear to be a correct production of an incorrect target. Gradient errors are those that are usually heard to be the correct target, but on instrumental examination display characteristics of an incorrect production. Six participants repeated eight tongue twisters in both a baseline and an experimental condition. This study was interested in errors produced on the onset stop consonant pairs /t, d/ and /k, g/. Recordings were transcribed to determine the perceptual identity of each target. Ultrasound videos were then analyzed and an individual frame representing the articulatory posture for each closure was extracted. These frames were fit with a smoothing spline curve using Edgetrak software. A curve-to-curve analysis based on the methods of Zharkova (2009) was conducted as a means of further investigating variation in individual speakers as well as providing a quantitative measure of errors. Results from the six speakers showed that all produced both categorical and gradient errors. The speakers showed individual variation in the stability of their productions and overall rate of errors. There was an observable trend for speakers who were more stable in their baseline productions to produce fewer errors, both gradient and categorical, in the experimental portion. 
Conversely, those speakers who exhibited more variation in their baseline productions had a higher rate of error under the experimental condition.
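A curve-to-curve comparison of tongue contours can be sketched with a mean nearest-neighbour distance between two contour point sets. This is a simplified illustrative variant, not the exact Zharkova (2009) procedure used in the thesis, and the contours below are synthetic:

```python
import numpy as np

def curve_to_curve_distance(a, b):
    """Mean nearest-neighbour distance between two 2-D point curves.

    a, b: (N, 2) and (M, 2) arrays of (x, y) points, e.g. tongue-surface
    spline points extracted from ultrasound frames.
    """
    # pairwise Euclidean distances, shape (N, M)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # symmetrise: average the A->B and B->A mean nearest distances
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Two hypothetical tongue contours: identical shape, one shifted up by 2 mm
x = np.linspace(0.0, 50.0, 100)
curve_a = np.column_stack([x, 10.0 + 5.0 * np.sin(x / 10.0)])
curve_b = np.column_stack([x, 12.0 + 5.0 * np.sin(x / 10.0)])
dist = curve_to_curve_distance(curve_a, curve_b)
print(dist)  # close to, and bounded by, the 2 mm vertical offset
```

Larger distances between repeated productions of the same target indicate less stable articulation, which is the sense in which the thesis quantifies speaker stability.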
23

The XMM-Newton EPIC X-ray Light Curve Analysis of WR 6.

Ignace, Richard, Gayley, K., Hamann, W.-R., Huenemoerder, D., Oskinova, L., Pollock, A., McFall, M. 20 September 2013 (has links) (PDF)
We obtained four pointings of over 100 ks each of the well-studied Wolf-Rayet star WR 6 with the XMM-Newton satellite. With a first paper emphasizing the results of spectral analysis, this follow-up highlights the X-ray variability clearly detected in all four pointings. However, phased light curves fail to confirm obvious cyclic behavior on the well-established 3.766 d period widely found at longer wavelengths. The data are of such quality that we were able to conduct a search for "event clustering" in the arrival times of X-ray photons. However, we fail to detect any such clustering. One possibility is that X-rays are generated in a stationary shock structure. In this context we favor a co-rotating interaction region (CIR) and present a phenomenological model for X-rays from a CIR structure. We show that a CIR has the potential to account simultaneously for the X-ray variability and constraints provided by the spectral analysis. Ultimately, the viability of the CIR model will require both intermittent long-term X-ray monitoring of WR 6 and better physical models of CIR X-ray production at large radii in stellar winds.
24

Discrimination between sincere and deceptive isometric grip response using Segmental Curve Analysis

Stout, Molly L. 12 September 2009 (has links)
This investigation was conducted to explore the between-trial variability of the measures of isometric peak force, time to peak force, area to peak force, area under the curve, slope (20%-80%), and average slope for subjects assigned to perform a series of four isometric grip strength contractions, and to develop a discriminant function equation that would predict group membership. Forty-nine college students were instructed to perform either a series of four maximal voluntary contractions (sincere) or a series of four submaximal contractions (deceptive). The subjects were retested 24-48 hours after the initial test session. Data from both test sessions were recorded, displayed, and analyzed using segmental curve analysis. The coefficient of variation was computed for each test variable. The grand mean coefficient of variation was .31 ± .02 for the sincere condition, compared with .77 ± .11 for the deceptive condition (p < .01). Coefficients of variation were used to predict group membership. The prediction equation accurately classified 92% of the sincere condition and 64% of the deceptive condition. / Master of Science
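The core idea, that sincere maximal efforts are more repeatable than deceptive submaximal ones, can be sketched as follows. The force values and the cutoff are hypothetical (chosen between the group means reported above), not the study's discriminant function:

```python
import numpy as np

def coefficient_of_variation(trials):
    """CV = sample SD / mean across repeated trials of one measure."""
    trials = np.asarray(trials, dtype=float)
    return trials.std(ddof=1) / trials.mean()

# Hypothetical peak-force values (N) from four grip trials
sincere   = [412.0, 405.0, 398.0, 420.0]   # maximal effort: consistent
deceptive = [100.0, 320.0, 90.0, 300.0]    # submaximal effort: variable

cv_s = coefficient_of_variation(sincere)
cv_d = coefficient_of_variation(deceptive)

# Simple cutoff placed between the reported group means (.31 vs .77)
CUTOFF = 0.5
for name, cv in [("sincere", cv_s), ("deceptive", cv_d)]:
    label = "deceptive" if cv > CUTOFF else "sincere"
    print(f"{name}: CV={cv:.2f} -> classified {label}")
```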
25

A New Series of Rate Decline Relations Based on the Diagnosis of Rate-Time Data

Boulis, Anastasios 14 January 2010 (has links)
The so-called "Arps" rate decline relations are by far the most widely used tool for assessing oil and gas reserves from rate performance. These relations (i.e., the exponential and hyperbolic decline relations) are empirical; the starting point for their derivation is the definitions of the "loss ratio" and the "derivative of the loss ratio," where the "loss ratio" is the ratio of the rate to the rate derivative and the "derivative of the loss ratio" is the "b-parameter" as defined by Arps [1945]. The primary goal of this work is the interpretation of the b-parameter continuously over time and thus a better understanding of its character. As shown below, we propose "monotonically decreasing functional forms" for the characterization of the b-parameter, in addition to the exponential and hyperbolic rate decline relations, in which the b-parameter is assumed to be zero and constant, respectively. The proposed functional forms are: b(t) = constant (Arps' hyperbolic rate-decline relation); b(t) = b0 exp(-b1 t) (exponential function); b(t) = b0 t^b1 (power-law function); and b(t) = b0/(1 + b1 t) (rational function). The corresponding rate decline relation for each case is obtained by solving the differential equation associated with the selected functional form for the b-parameter. The next step is to test and validate each rate decline relation by applying it to various numerical simulation cases (for gas), as well as to field data cases from tight/shale gas reservoirs. Our results indicate that the b-parameter is never constant but changes continuously with time. The ultimate objective of this work is to establish each model as a potential analysis/diagnostic relation. Most of the proposed models yield more realistic estimates of gas reserves than the traditional Arps rate decline relations (i.e., the hyperbolic decline), whose reserve estimates are inconsistent and over-estimated.
As an example, the rational b-parameter model seems to be the most accurate model in terms of representing the character of rate data; and therefore, should yield more realistic reserves estimates. Illustrative examples are provided for better understanding of each b-parameter rate decline model. The proposed family of rate decline relations was based on the character of the b-parameter computed from the rate-time data and they can be applied to a wide range of data sets, as dictated by the character of rate data.
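A numerical sketch of the rational-b(t) case described above: given b(t) = b0/(1 + b1 t), integrate d(1/D)/dt = b(t) and dq/dt = -D q to produce a rate history, then recover b(t) from the rate data as a diagnostic. The parameter values are illustrative, not from the thesis:

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Rational b-parameter model: b(t) = b0 / (1 + b1 * t)  (illustrative values)
b0, b1, qi, Di = 1.2, 0.05, 1000.0, 0.3

t = np.linspace(0.0, 120.0, 4001)          # time, e.g. months

# d(1/D)/dt = b(t)  =>  1/D(t) = 1/Di + (b0/b1) * ln(1 + b1 t)
inv_D = 1.0 / Di + (b0 / b1) * np.log1p(b1 * t)
D = 1.0 / inv_D

# dq/dt = -D q  =>  q(t) = qi * exp(-integral of D)
q = qi * np.exp(-cumulative_trapezoid(D, t, initial=0.0))

# Diagnostic: b(t) is the derivative of the loss ratio q / (-dq/dt)
dq = np.gradient(q, t)
b_est = np.gradient(q / (-dq), t)
i = 2000  # an interior point, t = 60
print(b_est[i], b0 / (1.0 + b1 * t[i]))  # the two values agree closely
```

Computing b(t) this way from actual rate-time data (after smoothing) is the kind of diagnosis the thesis uses to argue that the b-parameter decreases with time rather than staying constant.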
26

Developing a School Social Work Model for Predicting Academic Risk: School Factors and Academic Achievement

Lucio, Robert 21 October 2008 (has links)
The impact of school factors on academic achievement has become an important focus for school social work and revealed the need for a comprehensive school social work model that allows for the identification of critical areas to apply social work services. This study was designed to develop and test a more comprehensive school social work model. Specifically, the relationship between cumulative grade point average (GPA) and the cumulative risk index (CRI) and an additive risk index (ARI) were tested and a comparison of the two models was presented. Over 20,000 abstracts were reviewed in order to create a list of factors which have been shown in previous research to impact academic achievement. These factors were divided into the broad domains of personal factors, family factors, peer factors, school factors, and neighborhood or community factors. Factors that were placed under the school domain were tested and those factors which met all three criteria were included in the overall model. Consistent with previous research, both the CRI and ARI were shown to be related to cumulative GPA. As the number of risk factors increased, GPA decreased. After a discussion of the results, a case was made for the use of an additive risk index approach fitting more with the current state of social work. In addition, selecting cutoff points for determining risk and non-risk students was accomplished using an ROC analysis. Finally, implications for school social work practice on the macro-, meso-, and micro- levels were discussed.
27

Разработка математического и программного обеспечения для анализа кривой обучения школьников : магистерская диссертация / Development of mathematical methods and software for analyzing the learning curve of schoolchildren : master's thesis

Исаков, Э. Н., Isakov, E. N. January 2021 (has links)
The relevance of the topic stems from the school's need for an automated information system for testing students and drilling course material, with tools for analyzing the results, in order to improve the quality of the educational program and automate the grading of the tests students complete. The scientific novelty lies in the creation of an information system equipped with data collection and analysis tools for research, and in the development of a formula for calculating an indicator of how well students have mastered the material, which is used to construct a learning curve. The practical significance is that this information system will be used in the school to drill course material and analyze student results.
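The thesis's mastery-indicator formula is not given in the abstract. Purely as a generic illustration of learning-curve construction, a saturating-exponential model can be fitted to per-attempt test scores; all numbers below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(attempt, L, k):
    """Saturating exponential: score approaches plateau L at rate k."""
    return L * (1.0 - np.exp(-k * attempt))

# Hypothetical percent-correct scores over eight practice attempts
attempts = np.arange(1, 9)
scores = np.array([35.0, 55.0, 68.0, 75.0, 82.0, 85.0, 87.0, 89.0])

popt, _ = curve_fit(learning_curve, attempts, scores, p0=[90.0, 0.5])
L_hat, k_hat = popt
print(f"plateau={L_hat:.1f}%, learning rate={k_hat:.2f} per attempt")
```

Fitting such a curve per student gives a compact summary (plateau and rate) that an analysis tool could track over time.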
28

Acurácia diagnóstica da variação da pressão de pulso mensurada em artéria periférica para predição de diferentes aumentos do volume sistólico em resposta ao desafio volêmico em cães / Diagnostic accuracy of pulse pressure variation measured in a peripheral artery for predicting different increases in stroke volume in response to a fluid challenge in dogs

Dalmagro, Tábata Larissa. January 2019 (has links)
Advisor: Francisco José Teixeira-Neto / Abstract: Objective – To determine the diagnostic accuracy of pulse pressure variation (PPV) measured from a peripheral artery in predicting different percent increases in stroke volume induced by a fluid challenge in dogs. Methods – Thirty-nine adult bitches (19.3 ± 3.6 kg) undergoing elective ovariohysterectomy were included. Anesthesia was maintained with isoflurane under volume-controlled ventilation (tidal volume 12 mL/kg; inspiratory pause during 40% of inspiratory time; inspiration:expiration ratio 1:1.5). Cardiac output was obtained by transpulmonary thermodilution (femoral artery catheter) and PPV was measured from a dorsal pedal artery catheter. Fluid responsiveness (FR) was evaluated with a fluid challenge of lactated Ringer's solution (LRS, 20 mL/kg over 15 minutes) administered once (n = 21) or twice (n = 18) before surgery. Receiver operating characteristic (ROC) curve analysis and the zone of diagnostic uncertainty (gray zone) of PPV cutoff thresholds were employed to evaluate the ability of PPV to discriminate responders to the last fluid challenge, defined by different percentage increases in stroke volume index (SVI) measured by transpulmonary thermodilution (SVI>10% to SVI>25%, in 5% increments). Results – The numbers of responders to the last fluid challenge were 25 (SVI>10%), 21 (SVI>15%), 18 (SVI>20%), and 14 (SVI>25%). The area under the ROC curve (AUROC) of PPV was 0.897 (SVI>10%), 0.968 (SVI>15%), 0.923 (SVI>20%), and 0.891 (SVI>25%) (p < 0.0001 vs. AUROC = 0.5). Gray zones of PPV cutoff ... (Complete abstract: click electronic access below) / Master's
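The ROC analysis reported above can be illustrated with the rank-based (Mann-Whitney) estimator of the area under the curve. The PPV values below are hypothetical, not the study's data:

```python
import numpy as np

def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen responder's PPV exceeds a
    randomly chosen non-responder's (ties count one half)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Hypothetical pre-challenge PPV (%) values
responders     = [14, 16, 11, 18, 13, 15, 12, 17]   # SVI rose above threshold
non_responders = [6, 9, 7, 10, 8, 12, 5, 7]

a = auroc(responders, non_responders)
print(f"AUROC = {a:.3f}")
```

An AUROC near 1 means PPV separates responders from non-responders well; the gray-zone approach additionally reports the range of cutoffs where classification remains uncertain.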
29

Monothermal caloric screening test performance: A relative operating characteristic (ROC) curve analysis

Murnane, Owen D., Akin, Faith W., Lynn, S. D., Cyr, D. G. 01 January 2009 (has links)
No description available.
30

Trajectories of Pure and Co-Occurring Internalizing and Externalizing Problems from Age 2 to Age 12: Findings from the NICHD Study of Early Child Care

Fanti, Kostas Andrea 03 May 2007 (has links)
According to previous research, internalizing and externalizing problems tend to be comorbid or co-occur at different ages in development (Angold, Costello, & Erkanli, 1999). The question that this dissertation addresses is how and why internalizing and externalizing problems, two disorders that represent separate forms of psychopathology, co-occur in children. This is an important question for the developmental psychopathology perspective because an appreciation of the concept of co-occurrence is essential for explaining the development and taxonomy of internalizing and externalizing psychopathology, and for understanding the etiology and course of these symptoms (Achenbach, 1990). Attempts to explain co-occurrence have proposed that co-occurring psychopathology might represent distinct, meaningful syndromes (Angold & Costello, 1992; O’Connor et al., 1998), and in support of this idea, evidence of the existence of pure and co-occurring internalizing and externalizing problems has been found (Keiley et al., 2003). However, no previous study has identified heterogeneous developmental patterns of pure or combined internalizing and externalizing problems within a dynamic framework by taking trajectories of change into account. This dissertation uses data from the NICHD study of Early Child Care to explore the co-occurrence between internalizing and externalizing problems from age 2 to 12 with the use of Latent Class Growth Analysis. The sample included 1232 children (52% male). Different groups of children exhibiting low/normative, pure internalizing, pure externalizing, and co-occurring internalizing and externalizing problems across the 10 year period were identified. The higher risk groups deviated from the low/normative group in terms of antecedents, SES risk, medical risk, difficult temperament, and home environment. 
Moreover, children who exhibited pure moderate externalizing problems, and children who exhibited chronic externalizing problems with and without co-occurring internalizing problems, engaged in more risky behaviors and were more likely to have friends who also engaged in risky behaviors. Furthermore, the pure chronic externalizing group and the groups scoring high on internalizing problems, with and without co-occurring externalizing problems, were more asocial with peers. Finally, children exhibiting chronic co-occurring externalizing and internalizing problems were more excluded by peers than the rest of the sample.
