171

Interação de laser com neurônios: óptica de tecidos e fotoneuromodulação da dor / Laser Neuron Interaction: Tissue Optics and Photoneuromodulation for pain

Sousa, Marcelo Victor Pires de 19 September 2014
A Terapia com Laser de Baixa Intensidade (TLBI) pode ser utilizada para tratar dores agudas e crônicas. Entender as vias da dor e as interações de fótons com tecidos neurais possibilitará compreender melhor essas terapias. A nocicepção pode ser autocontrolada por analgésicos endógenos, opióides naturais que bloqueiam a liberação de neurotransmissores excitatórios e, portanto, fazem uma inibição sensorial total e inespecífica. Sabe-se que a TLBI pode estimular a liberação desses analgésicos endógenos e inibir temporariamente o transporte axonal em fibras de pequeno diâmetro. Conhecer a fluência luminosa no interior do tecido neural é essencial para avaliar as hipóteses descritas, no entanto, devido à complexidade dos tecidos biológicos, calcular (ou simular) as interações da luz com os tecidos é impraticável. Para superar esse problema, desenvolvemos experimentos para estimar a fluência de luz em aplicação transcraniana de laser e mapeamos as propriedades ópticas do encéfalo de rato. Foi possível diferenciar tecidos encefálicos por suas características de atenuação da intensidade luminosa e estimar a profundidade da penetração da luz que pode ser de mais de 20 mm. Encontramos redução da dor evocada por pressão, calor, frio ou inflamação após TLBI transcraniana em camundongos. Para esse estudo comportamental, desenvolvemos um equipamento para avaliar o limiar de dor por aplicação de pressão nos animais, que opera com melhor precisão que os instrumentos comerciais. Os mecanismos de fotoneuromodulação foram investigados por quantificação de Trifosfato de Adenosina (ATP), imunofluorescência e marcação com hematoxilina-eosina de tecidos encefálicos que foram visualizados por microscópio confocal. A iluminação transcraniana aumentou a produção de ATP e de Fosfatase ácida prostática (um analgésico endógeno) e reduziu a quantidade do neurotransmissor glutamato, responsável pela condução da informação nociceptiva. Por outro lado, não observamos alteração na concentração de tubulina, um dos constituintes do citoesqueleto, e a marcação com hematoxilina-eosina revelou que não houve dano ao tecido decorrente da iluminação. Os achados apresentados nesse estudo atestam a relevância e eficácia da fotoneuromodulação proveniente de iluminação transcraniana com laser de 808 nm para suprimir a nocicepção em camundongos. Esse estudo é pioneiro na elucidação dos mecanismos de ação da fotoneuromodulação da dor in vivo. / Low-level light therapy (LLLT) can treat acute and chronic pain. Understanding the pain pathways and the photon-neuron interactions may contribute to a better understanding of this therapy. Nociception may be self-controlled by endogenous analgesics, natural opioids that block the release of excitatory neurotransmitters and thus produce a total, nonspecific sensory inhibition. It is known that LLLT can stimulate the release of these endogenous analgesics and temporarily inhibit axonal transport in small-diameter fibers. Knowledge of the light fluence within neural tissue is essential to test these hypotheses. However, due to the complexity of biological tissues, calculating (or simulating) light-tissue interactions is impractical. To overcome this problem, we developed an experimental setup to estimate the light fluence during transcranial laser application and to map the optical properties of the rat brain. It was possible to differentiate brain tissues by their light-attenuation characteristics and to estimate the depth of light penetration, which can exceed 20 mm.
We found a decrease in pain evoked by pressure, heat, cold or inflammation after transcranial LLLT in mice. For this behavioral study, we developed a device to assess the pain threshold under a pressure stimulus, which operates with better precision than commercial instruments. The LLLT mechanisms were investigated by adenosine triphosphate (ATP) quantification, immunofluorescence, and haematoxylin-eosin staining of brain tissues, which were imaged by confocal microscopy. Transcranial irradiation increased the production of ATP and prostatic acid phosphatase (an endogenous analgesic) and reduced the amount of glutamate, the neurotransmitter responsible for conducting nociceptive information. There was no change in the concentration of tubulin, a constituent of the cytoskeleton, and the haematoxylin-eosin staining revealed no tissue damage due to the irradiation. These findings attest to the relevance and efficacy of photoneuromodulation by transcranial 808 nm laser illumination for suppressing nociception in mice. This study is a pioneering effort in elucidating the mechanisms of in vivo photoneuromodulation of pain.
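Editor's note: as a rough illustration of the transcranial light-penetration estimates discussed in the abstract above, the sketch below applies a simple first-order exponential (Beer-Lambert-like) attenuation model. The effective attenuation coefficient `mu_eff` and the depths are illustrative placeholders, not values taken from the thesis, which mapped the actual optical properties of rat brain tissue.

```python
import numpy as np

# Hypothetical effective attenuation coefficient [1/mm]; the thesis measured
# tissue-specific values, which are not reproduced here.
mu_eff = 0.1

def fluence_fraction(depth_mm, mu=mu_eff):
    """Fraction of the surface fluence remaining at a given depth
    (broad-beam, single-exponential approximation)."""
    return np.exp(-mu * np.asarray(depth_mm, dtype=float))

for d in (1, 5, 10, 20):
    frac = float(fluence_fraction(d))
    print(f"depth {d:2d} mm -> {100 * frac:5.1f} % of surface fluence")

# One common definition of penetration depth: where the fluence drops to 1/e.
print("1/e penetration depth:", 1.0 / mu_eff, "mm")
```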
172

Eficácia em problemas inversos: generalização do algoritmo de recozimento simulado e função de regularização aplicados a tomografia de impedância elétrica e ao espectro de raios X / Efficiency in inverse problems: generalization of simulated annealing algorithm and regularization function applied to electrical impedance tomography and X-ray spectrum

Menin, Olavo Henrique 08 December 2014
A modelagem de processos em física e engenharia frequentemente resulta em problemas inversos. Em geral, esses problemas apresentam difícil resolução, pois são classificados como mal-postos. Resolvê-los, tratando-os como problemas de otimização, requer a minimização de uma função objetivo, que mede a discrepância entre os dados experimentais e os obtidos pelo modelo teórico, somada a uma função de regularização. Na maioria dos problemas práticos, essa função objetivo é não-convexa e requer o uso de métodos de otimização estocásticos. Dentre eles, tem-se o algoritmo de recozimento simulado (Simulated Annealing), que é baseado em três pilares: i) distribuição de visitação no espaço de soluções; ii) critério de aceitação; e iii) controle da estocasticidade do processo. Aqui, propomos uma nova generalização do algoritmo de recozimento simulado e da função de regularização. No algoritmo de otimização, generalizamos o cronograma de resfriamento, que usualmente é considerado algébrico ou logarítmico, e o critério de Metropolis. Com relação à função de regularização, unificamos as versões mais utilizadas, em uma única fórmula. O parâmetro de controle dessa generalização permite transitar continuamente entre as regularizações de Tikhonov e entrópica. Por meio de experimentos numéricos, aplicamos nosso algoritmo na resolução de dois importantes problemas inversos na área de Física Médica: a determinação do espectro de um feixe de raios X, a partir de sua curva de atenuação, e a reconstrução da imagem na tomografia de impedância elétrica. Os resultados mostram que o algoritmo de otimização proposto é eficiente e apresenta um regime ótimo de parâmetros, relacionados à divergência do segundo momento da distribuição de visitação. / Modeling of processes in physics and engineering frequently yields inverse problems. These problems are usually difficult to solve since they are classified as ill-posed. Solving them as optimization problems requires the minimization of an objective function that measures the discrepancy between experimental and theoretical data, added to a regularization function. For most practical inverse problems, this objective function is non-convex and requires a stochastic optimization method. Among these methods is the simulated annealing algorithm, which is based on three fundamentals: i) the visitation distribution in the search space; ii) the acceptance criterion; and iii) the control of the stochasticity of the process. Here, we propose a new generalization of the simulated annealing algorithm and of the regularization function. In the optimization algorithm, we generalized both the cooling schedule, which is usually algebraic or logarithmic, and the Metropolis acceptance criterion. Regarding the regularization function, we unified the most widely used versions into a single formula. The control parameter of this generalization allows a continuous transition between Tikhonov and entropic regularization. Through numerical experiments, we applied our algorithm to solve two important inverse problems in medical physics: the determination of an X-ray beam spectrum from its attenuation curve and the image reconstruction in electrical impedance tomography. The results show that the proposed algorithm is efficient and presents an optimal regime of parameters, related to the divergence of the second moment of the visitation distribution.
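Editor's note: the abstract above describes a generalized simulated annealing scheme with a regularization that interpolates between Tikhonov and entropic penalties. The following is a minimal sketch of that general idea only; the cooling schedule, the interpolation formula in `regularization(x, q)`, the toy problem and all parameter values are illustrative assumptions, not the generalized forms actually derived in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def regularization(x, q):
    """Illustrative blend of a Tikhonov penalty (q near 1) and an entropic
    penalty (q near 0); the thesis's unified formula is not reproduced here."""
    x = np.clip(x, 1e-12, None)          # entropic term needs positive entries
    tikhonov = np.sum(x ** 2)
    entropic = np.sum(x * np.log(x))
    return q * tikhonov + (1.0 - q) * entropic

def objective(x, A, b, lam, q):
    # discrepancy between model and data, plus regularization
    return np.sum((A @ x - b) ** 2) + lam * regularization(x, q)

def simulated_annealing(A, b, lam=1e-2, q=0.5, n_iter=20000, T0=1.0):
    x = np.full(A.shape[1], 1.0 / A.shape[1])     # uniform, non-negative start
    f = objective(x, A, b, lam, q)
    for k in range(1, n_iter + 1):
        T = T0 / np.log(1.0 + k)                  # logarithmic cooling (classical choice)
        cand = np.clip(x + rng.normal(scale=0.01, size=x.size), 0.0, None)
        fc = objective(cand, A, b, lam, q)
        if fc < f or rng.random() < np.exp(-(fc - f) / T):   # Metropolis criterion
            x, f = cand, fc
    return x, f

# Toy inverse problem: recover a non-negative "spectrum" x from data b = A x.
A = rng.random((30, 10))
x_true = rng.random(10)
b = A @ x_true
x_est, f_est = simulated_annealing(A, b)
print("objective:", f_est,
      " relative error:", np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true))
```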
173

Luminescência opticamente estimulada com aplicações em radioterapia: dependência da dose absorvida e da energia de fótons produzidos em aceleradores clínicos e Microtron - IFUSP / Optically stimulated luminescence with applications in radiotherapy: dependence of absorbed dose and photon energy produced on clinical accelerators and Microtron - IFUSP

Freitas, Carlos Eduardo 28 April 2017
Atualmente a utilização de fótons de raios X e gama na Medicina (radiodiagnóstico e radioterapia) gerou a necessidade do conhecimento da dose absorvida nos tecidos dos pacientes. Para isso, utiliza-se detectores com diversas funcionalidades a fim de conhecer a deposição de energia. Para ser possível a utilização destes materiais, é necessário caracterizá-los tanto quanto à resposta em função da dose absorvida, quanto à dependência com a energia. Neste trabalho foi estudada a dependência energética da resposta OSL dos dosímetros compostos por BeO e Al2O3 em feixes de raios X convencional com espectro largo, feixes de raios X de aceleradores clínicos e do acelerador Microtron do Instituto de Física da USP e fontes gama de Cs-137 e Co-60, em uma faixa de doses desde centenas de mGy até aproximadamente 2 Gy. Foram utilizadas em conjunto simulações em Monte Carlo com o código PENELOPE, que simula o transporte de fótons e elétrons na matéria, a fim de conhecer a deposição de energia nos materiais. Os espectros utilizados nas simulações foram validados através da comparação das características dos feixes tais como a deposição de dose com a profundidade, camada semirredutora, energia média e resolução espectral. Utilizando um sistema de varredura por câmara de ionização controlado remotamente, foi estudada a distribuição de doses no campo de irradiação para definir uma região de irradiação com dose uniforme para poder realizar a irradiação de um conjunto de dosímetros simultaneamente. Foram encontrados valores do fator de dependência energética experimental para os feixes mais energéticos (Co-60, Microtron 5 MeV e aceleradores clínicos de 6 e 15 MeV), com desvio menor que 3% para o Al2O3 e 5% para o BeO, indicando independência com a energia do feixe de irradiação. Já nos feixes de baixa energia, os fatores de dependência energética relativos à energia do Co-60 para o Al2O3 foram de 1,68 e 1,42, e, para o BeO, de 0,78 e 0,90 para os feixes W150 e W200 respectivamente. Estes valores indicam que, nesta faixa de energia, a resposta OSL depende mais fortemente da energia do feixe. As simulações em Monte Carlo feitas para os fatores de dependência energética se mostraram bem consistentes com os valores experimentais para o BeO em toda faixa de energia e para o Al2O3, excetuando-se os feixes de baixa energia, possivelmente pelo não completo conhecimento de todos os elementos constituintes do dosímetro empregado (fita de Luxel®). Quanto às curvas de emissão OSL para as diferentes energias de irradiação, foi observado que o sinal decai mais rápido quanto mais elevada é a energia média do feixe para o BeO. Para o Al2O3, o efeito foi mais sutil e oposto. Os estudos do feixe do Microtron mostraram que, desde que a corrente seja monitorada durante as irradiações para posterior correção da dose, é possível realizar estudos utilizando um feixe de fótons de Bremsstrahlung produzido com alvo externo e energias médias (simuladas e ainda não validadas experimentalmente) próximas às energias das fontes gama de Cs-137 e Co-60. Os resultados experimentais para as respostas OSL dos dois materiais foram compatíveis com as simulações. / Nowadays the use of X-ray and gamma-ray photons in medicine (radiodiagnosis and radiotherapy) has generated the need to know the dose absorbed in the tissues of patients. For this purpose, detectors with diverse functionalities are used to determine the energy deposition.
To make the use of these materials possible, it is necessary to characterize the response as a function of absorbed dose, as well as the energy dependence of the response. In this work the energy dependence of the OSL response of BeO and Al2O3 dosimeters was studied for conventional X-ray beams with a wide energy spectrum, X-ray beams from clinical accelerators and from the Microtron accelerator of the Institute of Physics (University of São Paulo), and for gamma sources of Cs-137 and Co-60, in a dose range from hundreds of mGy to about 2 Gy. Monte Carlo simulations were performed with the PENELOPE code, which simulates the transport of photons and electrons in matter, in order to determine the energy deposition in the materials. The energy spectra used in the simulations were validated by comparing beam characteristics such as depth dose deposition, half-value layer, mean energy and spectral resolution. Using a remotely controlled ionization chamber scanning system, the dose distribution within the irradiation field was evaluated to define a uniform-dose region in which a set of dosimeters could be irradiated simultaneously. Experimental energy dependence factors were obtained for the high-energy beams (Co-60, Microtron 5 MeV, and clinical accelerators of 6 and 15 MeV), with a deviation of less than 3% for Al2O3 and 5% for BeO, indicating independence from the beam energy. For the low-energy X-ray beams, the energy dependence factors relative to the Co-60 energy were 1.68 and 1.42 for Al2O3, and 0.78 and 0.90 for BeO, for the W150 and W200 beams respectively. These values indicate that, in this energy range, the OSL response depends more strongly on the beam energy. The Monte Carlo simulations of the energy dependence factors were consistent with the experimental values for BeO over the entire energy range, and for Al2O3 with the exception of the low-energy beams, possibly due to incomplete knowledge of all the elements that constitute the dosimeter used (Luxel® tape). As for the shape of the OSL emission curves at the different irradiation energies, it was observed for BeO that the higher the average beam energy, the faster the signal decays. For Al2O3, the effect was more subtle and opposite. The Microtron beam studies have shown that, as long as the current is monitored during irradiation for subsequent dose correction, it is possible to carry out studies using a beam of Bremsstrahlung photons produced with an external target and mean energies (simulated but not yet validated experimentally) close to the energies of the Cs-137 and Co-60 gamma sources. The experimental results for the OSL responses of both materials were compatible with the simulations.
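Editor's note: the energy dependence factors quoted above are, in the usual convention, the dosimeter signal per unit absorbed dose in a given beam normalized to the same quantity for the Co-60 reference beam. The sketch below only illustrates that ratio; the signal and dose values are made up, not data from the thesis.

```python
def energy_dependence_factor(signal_q, dose_q, signal_co60, dose_co60):
    """OSL signal per unit dose in beam quality Q, relative to Co-60."""
    return (signal_q / dose_q) / (signal_co60 / dose_co60)

# Hypothetical readings for dosimeters given 1 Gy in each beam.
factor = energy_dependence_factor(signal_q=1.7e5, dose_q=1.0,
                                  signal_co60=1.0e5, dose_co60=1.0)
print(f"energy dependence factor relative to Co-60: {factor:.2f}")
# Values near 1 (as reported for the high-energy beams) indicate an
# energy-independent response; values well above 1, such as 1.68 for Al2O3
# in the W150 beam, indicate over-response at low photon energies.
```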
174

Espectroscopia de raios X utilizando o espalhamento Compton / X-ray spectroscopy using Compton scattering

Linke, Arlene 16 September 2008
O estudo do espectro de raios X é importante para predizer a qualidade da imagem em sistemas radiográficos. Nesse trabalho usamos um detector semicondutor CdTe para medir os espectros de raios X na faixa de 25 a 120 kVp. Esse detector é pequeno e fácil de ser transportado. As medidas feitas diretamente do tubo de raios X não são aconselháveis devido à alta taxa de fluência, que está acima dos limites de detecção dos detectores usuais e pode causar armadilhamento de carga no detector CdTe. A alternativa é medir os fótons espalhados a 90° a partir de um pequeno objeto espalhador, no nosso caso um cilindro de lucite (PMMA). A partir do espectro espalhado, corrigido para tempo morto e eficiência, e calibrado em energia, efetuamos correções para o ângulo de espalhamento e espalhamento coerente (somente para baixas energias). Só então o formalismo de Klein-Nishina para as seções de choque do efeito Compton é aplicado aos espectros, resultando nos espectros primários. Para energia de 120 kVp foi medido um espectro do feixe direto e comparado com o obtido através do espectro espalhado para mesma energia. Para o mesmo feixe, a camada semi-redutora foi avaliada experimentalmente e com o espectro obtido por espalhamento Compton, obtendo-se boa concordância. Observamos na curva de eficiência uma descontinuidade, prevista teoricamente e também observada por outros autores, que deforma o espectro corrigido. Foi aplicado um ajuste polinomial a essa curva, suavizando os espectros obtidos e não alterando sua forma. Os resultados foram satisfatórios e validaram o formalismo apresentado por outros autores, utilizando detectores distintos. / The analysis of X-ray spectra is important to predict image quality in radiographic systems. In this work we used a semiconductor CdTe detector to measure X-ray spectra in the range from 25 to 120 kVp. The CdTe detector is small and portable. Measurements made directly in the primary X-ray beam are difficult because of the high photon fluence rates, which cause significant photon pile-up and can cause charge trapping in the CdTe. The alternative is to measure the photons scattered at 90° by a small scattering object, in our case a Lucite (PMMA) cylinder. Starting from the scattered spectrum, corrected for dead time and efficiency and calibrated in energy, we applied corrections for the scattering angle and for coherent scattering (only at low energies). After that, the Klein-Nishina cross section for Compton scattering is applied to the spectra, resulting in the primary spectra. For the 120 kVp beam, a spectrum of the direct beam was measured and compared to the one obtained from the scattered spectrum at the same energy. The half-value layer of the same beam was determined experimentally and from the calculated spectrum, and the agreement was very good. The efficiency curve presents a discontinuity, predicted theoretically and also observed by other authors, that deforms the corrected spectrum. A polynomial fit was applied to the efficiency curve, smoothing the corrected spectra without altering their shape. The results were satisfactory and validated the methodology presented by other authors using different detectors.
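Editor's note: the reconstruction described above rests on two standard relations, the Compton energy shift and the Klein-Nishina differential cross section. The sketch below shows only that final step for a 90° geometry; the scattered-spectrum bins are invented numbers, and the dead-time, efficiency and coherent-scattering corrections applied in the thesis are assumed to have been done already.

```python
import numpy as np

ME_C2 = 511.0        # electron rest energy [keV]
R_E = 2.8179403e-13  # classical electron radius [cm]
THETA = np.pi / 2    # 90-degree scattering geometry

def primary_energy(e_scat_keV, theta=THETA):
    """Invert the Compton shift: photon energy before scattering,
    given the energy detected at angle theta."""
    return e_scat_keV / (1.0 - (e_scat_keV / ME_C2) * (1.0 - np.cos(theta)))

def klein_nishina(e_prim_keV, theta=THETA):
    """Klein-Nishina differential cross section dsigma/dOmega [cm^2/sr]
    for unpolarized photons."""
    ratio = 1.0 / (1.0 + (e_prim_keV / ME_C2) * (1.0 - np.cos(theta)))  # E'/E
    return 0.5 * R_E ** 2 * ratio ** 2 * (ratio + 1.0 / ratio - np.sin(theta) ** 2)

# Hypothetical scattered-spectrum bins (energy in keV, corrected counts).
e_scat = np.array([20.0, 40.0, 60.0, 80.0])
counts = np.array([150.0, 400.0, 300.0, 80.0])

e_prim = primary_energy(e_scat)           # remap each bin to the primary energy
weights = counts / klein_nishina(e_prim)  # undo the angular scattering probability
for ep, w in zip(e_prim, weights):
    print(f"primary energy {ep:6.1f} keV  relative fluence {w:.3e}")
```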
175

Incorporação do espalhamento Compton no modelo de TBC modificado / Compton scattering incorporation on modified TBC model

Torres, Daniel Cruz 08 October 2015
No último século, houve grande avanço no entendimento das interações das radiações com a matéria. Essa compreensão se faz necessária para diversas aplicações, entre elas o uso de raios X no diagnóstico por imagens. Neste caso, imagens são formadas pelo contraste resultante da diferença na atenuação dos raios X pelos diferentes tecidos do corpo. Entretanto, algumas das interações dos raios X com a matéria podem levar à redução da qualidade destas imagens, como é o caso dos fenômenos de espalhamento. Muitas abordagens foram propostas para estimar a distribuição espectral de fótons espalhados por uma barreira, ou seja, como no caso de um feixe de campo largo, ao atingir um plano detector, tais como modelos que utilizam métodos de Monte Carlo e modelos que utilizam aproximações analíticas. Supondo-se um espectro de um feixe primário que não interage com nenhum objeto após sua emissão pelo tubo de raios X, este espectro é, essencialmente, representado pelos modelos propostos anteriormente. Contudo, considerando-se um feixe largo de radiação X, interagindo com um objeto, a radiação a ser detectada por um espectrômetro passa a ser composta pelo feixe primário, atenuado pelo material adicionado, e uma fração de radiação espalhada. A soma destas duas contribuições passa a compor o feixe resultante. Esta soma do feixe primário atenuado, com o feixe de radiação espalhada, é o que se mede em um detector real na condição de feixe largo. O modelo proposto neste trabalho visa calcular o espectro de um tubo de raios X, em situação de feixe largo, o mais fidedigno possível ao que se mede em condições reais. Neste trabalho se propõe a discretização do volume de interação em pequenos elementos de volume, nos quais se calcula o espalhamento Compton, fazendo uso de um espectro de fótons gerado pelo Modelo de TBC, a equação de Klein-Nishina e considerações geométricas. Por fim, o espectro de fótons espalhados em cada elemento de volume é somado ao espalhamento dos demais elementos de volume, resultando no espectro total espalhado. O modelo proposto foi implementado em ambiente computacional MATLAB® e comparado com medições experimentais para sua validação. O modelo proposto foi capaz de produzir espectros espalhados em diferentes condições, apresentando boa conformidade com os valores medidos, tanto em termos quantitativos, nas quais a diferença entre kerma no ar calculado e kerma no ar medido é menor que 10%, quanto qualitativos, com fatores de mérito superiores a 90%. / The understanding of the interactions between radiation and matter advanced considerably in the last century. This understanding is needed by several applications, such as the use of X-rays in diagnostic imaging. In diagnostic applications, the image is created by the contrast resulting from the differences in X-ray attenuation by the different body tissues. However, some interactions of the X-rays with matter may reduce the quality of the images obtained in diagnostic imaging, as is the case of the scattering phenomena. There are several modeling approaches to estimate the spectral distribution of photons scattered by a barrier, as in the case of a broad beam hitting a spectrometer detector; for instance, approaches that use Monte Carlo methods and approaches that use analytical approximations. Assuming a primary spectrum that does not interact with any object after its emission by the X-ray tube, this spectrum is essentially represented by the previously proposed models.
However, considering a broad beam of X-rays interacting with an object, the radiation detected by a spectrometer is composed of the primary beam, attenuated by the added material, plus a scattered radiation fraction. The sum of these two contributions makes up the resulting beam. This sum of the attenuated primary beam with the scattered radiation is what is measured in a real detector under broad-beam conditions. The model proposed in this work aims to calculate the spectrum of an X-ray tube in a broad-beam situation as faithfully as possible to what is measured under real conditions. We propose the discretization of the interaction volume into small volume elements, in which the Compton scattering is calculated using a photon spectrum generated by the TBC model, the Klein-Nishina equation and geometric considerations. The spectrum of photons scattered in each volume element is then added to that of the other volume elements, resulting in the total scattered spectrum. The proposed model was implemented in the MATLAB® computational environment and evaluated by comparing the computational results with physical measurements. The model was able to produce scattered spectra under different conditions, in good agreement with the measured values both quantitatively, with a difference of less than 10% between the calculated and measured air kerma, and qualitatively, with figures of merit above 90%.
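Editor's note: the following is a minimal structural sketch of the voxel-summation idea described in this abstract: the interaction volume is split into small elements, each element scatters part of the primary fluence toward the detector, and the contributions are summed. The primary spectrum, the attenuation coefficient, the `angular_weight` placeholder and the geometry are illustrative assumptions; they stand in for, but do not reproduce, the TBC-model spectrum and the Klein-Nishina treatment used in the thesis.

```python
import numpy as np

def angular_weight(theta):
    # Placeholder for a Klein-Nishina-based angular weight, arbitrary normalization.
    return 0.5 * (1.0 + np.cos(theta) ** 2)

def scattered_signal(voxels, detector, primary_fluence, mu=0.02):
    """Sum scattered contributions from each volume element at the detector position.

    voxels: (N, 3) voxel centre coordinates [mm]; detector: (3,) position;
    primary_fluence: fluence reaching each voxel along the beam (length N);
    mu: illustrative linear attenuation coefficient [1/mm] for the exit path.
    """
    total = 0.0
    beam_dir = np.array([0.0, 0.0, 1.0])                 # beam travels along +z
    for r, phi in zip(voxels, primary_fluence):
        to_det = detector - r
        dist = np.linalg.norm(to_det)
        theta = np.arccos(np.dot(to_det / dist, beam_dir))   # scattering angle
        solid_angle = 1.0 / dist ** 2                        # per unit detector area
        exit_attenuation = np.exp(-mu * dist)                # attenuation on the way out
        total += phi * angular_weight(theta) * solid_angle * exit_attenuation
    return total

# Tiny example: a 5 mm cube of 1 mm voxels, detector 100 mm to the side.
grid = np.arange(0.0, 5.0, 1.0)
voxels = np.array([[x, y, z] for x in grid for y in grid for z in grid])
primary = np.exp(-0.02 * voxels[:, 2])                   # primary beam attenuated along z
print("scattered signal (arbitrary units):",
      scattered_signal(voxels, np.array([100.0, 0.0, 2.5]), primary))
```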
176

New method of collecting output factors for commissioning linear accelerators with special emphasis on small fields and intensity modulated radiation therapy

Unknown Date
Common methods for commissioning linear accelerators often neglect beam data for small fields. Examining the methods of beam data collection and modeling for commissioning linear accelerators revealed little to no discussion of protocols for fields smaller than 4 cm x 4 cm. This leads to decreased confidence in the dose calculations and associated monitor units (MUs) for Intensity Modulated Radiation Therapy (IMRT). Commissioning the Novalis linear accelerator (linac) on the Eclipse Treatment Planning System (TPS) led to a study of the challenges of collecting data for very small fields. The focus of this thesis is the examination of protocols for output factor collection and their impact on the doses calculated by the TPS for IMRT treatment plans. Improving the output factor collection methods led to a significant improvement in absolute dose calculations, which correlated with the complexity of the plans. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
177

Sparse Modeling Applied to Patient Identification for Safety in Medical Physics Applications

Unknown Date
Every scheduled treatment at a radiation therapy clinic involves a series of safety protocols to ensure the utmost patient care. Despite these protocols, on rare occasions an entirely preventable medical event, an accident, may occur. Delivering a treatment plan to the wrong patient is preventable, yet it is still a clinically documented error. This research describes a computational method to identify patients with a novel machine learning technique to combat misadministration. The patient identification program stores face and fingerprint data for each patient. New, unlabeled data from those patients are categorized according to the library. The categorization of data by this face-fingerprint detector is accomplished with new machine learning algorithms based on sparse modeling that have already begun transforming the foundation of computer vision. Previous patient recognition software required special subroutines for faces and different, tailored subroutines for fingerprints. In this research, the same model is used for both fingerprints and faces, without any additional subroutines and even without adjusting the two hyperparameters. Sparse modeling is a powerful tool that has already shown utility in the areas of super-resolution, denoising, inpainting, demosaicing, and sub-Nyquist sampling, i.e. compressed sensing. Sparse modeling is possible because natural images are inherently sparse in some bases, due to their inherent structure. This research chooses datasets of face and fingerprint images to test the patient identification model. The model stores the images of each dataset as a basis (library). One image at a time is removed from the library and is classified by a sparse code in terms of the remaining library. The Locally Competitive Algorithm, a neurally inspired artificial neural network, solves the computationally difficult task of finding the sparse code for the test image. The components of the sparse representation vector are summed by ℓ1 pooling, and correct patient identification is consistently achieved in 100% of 1000 trials when either the face data or the fingerprint data are used as the classification basis. The algorithm also achieves 100% classification when faces and fingerprints are concatenated into multimodal datasets. This suggests that 100% patient identification will be achievable in the clinical setting. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
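Editor's note: as an illustration of the classification scheme described above (represent a test sample as a sparse combination of library samples, then pool the coefficient magnitudes per patient), here is a minimal sketch. It uses plain iterative soft thresholding (ISTA) rather than the Locally Competitive Algorithm from the thesis, and random toy vectors rather than face or fingerprint images; the names `ista` and `classify` and all parameter values are assumptions made for this sketch.

```python
import numpy as np

def ista(D, y, lam=0.1, n_iter=500):
    """Solve min_x 0.5 * ||Dx - y||^2 + lam * ||x||_1 by iterative soft thresholding."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2        # 1 / Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = x - step * (D.T @ (D @ x - y))        # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft threshold
    return x

def classify(D, labels, y):
    """Assign y to the class whose library columns carry the most l1 coefficient mass."""
    code = ista(D, y)
    pooled = {c: np.sum(np.abs(code[labels == c])) for c in np.unique(labels)}
    return max(pooled, key=pooled.get)

# Toy library: 3 "patients", 5 samples each, 64-dimensional features.
rng = np.random.default_rng(1)
centers = rng.normal(size=(3, 64))
D = np.column_stack([centers[c] + 0.1 * rng.normal(size=64)
                     for c in range(3) for _ in range(5)])
D /= np.linalg.norm(D, axis=0)                    # normalize library columns
labels = np.repeat(np.arange(3), 5)

query = centers[2] + 0.1 * rng.normal(size=64)    # new sample from patient 2
print("predicted patient:", classify(D, labels, query))
```

The pooling step is the decision rule: the absolute sparse coefficients belonging to each patient's library columns are summed, and the patient with the largest pooled mass is returned.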
178

Dosimetric comparison of inverse planning by simulated annealing (IPSA) and dose points optimized treatment plans in high dose rate (HDR) brachytherapy of skin lesions using Freiburg flap applicator

Unknown Date
A detailed dosimetric comparison between Inverse Planning by Simulated Annealing (IPSA) and Dose Points (DP) optimized treatment plans has been performed for High Dose Rate (HDR) brachytherapy of skin lesions using the Freiburg Flap applicator, in order to find out whether IPSA offers better clinical dosimetric outcomes for lesions categorized into four different curvatures. Without compromising target coverage, IPSA reduced the volume of the Planning Target Volume (lesion) that received at least 125% of the prescription dose on average by 41%. It also reduced the volume of the healthy skin surrounding the lesion that received at least 100% of the prescription dose on average by 42%. IPSA did not show any advantage over DP in sparing normal structures underlying the treated lesions. Although the DP optimization algorithm has been used routinely at Lynn Cancer Institute for HDR brachytherapy of skin lesions, recent upgrades in the IPSA software have made it more amenable to rapid treatment planning, and therefore IPSA can be used either in place of DP or as an alternative to it. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
179

A planar cable-driven robotic device for physical therapy assistance

Unknown Date
The design and construction of a tri-cable, planar robotic device for use in neurophysical rehabilitation are presented. The criteria for this system are based primarily on marketability factors, rather than ideal models or mathematical outcomes. The device is designed to be low cost and sufficiently safe for a somewhat disabled individual to use unsupervised at home, as well as in a therapist's office. The key features are the use of a barrier that prevents the user from coming into contact with the cables, as well as a "break-away" joystick that the user manipulates to perform the rehabilitation tasks. In addition, this device is portable, aesthetically acceptable and easy to operate. Other uses of this system include sports therapy, virtual reality and teleoperation of remote devices. / by Melissa M. Morris. / Includes a thesis demonstration video (QuickTime movie; time [2:25]; size [16.6 MB]; frame width [640]; frame height [480]). / Thesis (M.S.C.S.)--Florida Atlantic University, 2007. / Includes bibliography.
180

A characterization of the LAP Aquarius Phantom for external LAP laser alignment and magnetic resonance geometric distortion verification for stereotactic radiation surgery patient simulation

Unknown Date
This thesis explores additional applications of LAP's Aquarius external laser alignment verification Phantom by examining the geometric accuracy of magnetic resonance images commonly used for planning intracranial stereotactic radiation surgery (ICSRS) cases. The scans were performed with MRI protocols used for ICSRS and for head and neck diagnosis, and their images were fused to computed tomography (CT) images. The geometric distortions (GDs) were measured against the CT in the axial, sagittal, and coronal directions at different levels. Using the Aquarius Phantom, one is able to detect GD in ICSRS planning MRI acquisitions and to align the external LAP patient alignment lasers by following the LAP QA protocol. GDs of up to about 2 mm are observed at the distal regions of the longitudinal axis in the SRS treatment planning MR images. Based on the results, one may recommend the use of the Aquarius Phantom to determine whether margins should be included in SRS treatment planning. / by Daniel Vergara. / Thesis (M.S.)--Florida Atlantic University, 2012. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2012. Mode of access: World Wide Web.
