61

Radiation Dose Optimization For Critical Organs

January 2013 (has links)
abstract: Ionizing radiation used in patient diagnosis or therapy has short- and long-term negative effects on the patient's body, depending on the amount of exposure. More than 700,000 examinations are performed every day on interventional radiology modalities [1]; however, no patient-centric information about the organ dose received is available to the patient or to quality assurance. In this study, we explore methodologies to systematically reduce the absorbed radiation dose in fluoroscopically guided interventional radiology procedures. In the first part of this study, we develop a mathematical model that determines a set of geometry settings for the equipment and an energy level during a patient exam. The goal is to minimize the absorbed dose in the critical organs while maintaining the image quality required for diagnosis. The model is a large-scale mixed integer program. We performed polyhedral analysis and derived several sets of strong inequalities to improve the computational speed and the quality of the solution. Results show that the absorbed dose in the critical organ can be reduced by up to 99% for a specific set of angles. In the second part, we apply an approximate gradient method to simultaneously optimize the angle and table location, minimizing dose in the critical organs subject to image quality. In each iteration, we solve a sub-problem as a MIP to determine the radiation field size and the corresponding X-ray tube energy. In the computational experiments, results show a further reduction (up to 80%) of the absorbed dose compared with the previous method. Last, there are uncertainties in the medical procedures that make the absorbed dose imprecise. We propose a robust formulation to hedge against the worst-case absorbed dose while ensuring feasibility. In this part, we investigate a robust approach for organ motion within a radiology procedure. We minimize the absorbed dose for the critical organs across all input data scenarios, which correspond to the positioning and size of the organs. The computational results indicate up to a 26% increase in the absorbed dose calculated for the robust approach, which ensures feasibility across scenarios. / Dissertation/Thesis / Ph.D. Industrial Engineering 2013
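The abstract describes a mixed-integer program that trades absorbed dose against image quality. As a rough illustration of that kind of formulation (not the dissertation's actual model), the toy sketch below selects one geometry/energy setting that minimizes dose to a critical organ subject to a minimum image-quality score; the angle names, dose and quality values, and the use of the PuLP library are all assumptions for illustration.

```python
import pulp

# Illustrative candidate C-arm angles and X-ray tube energy levels (hypothetical).
settings = [(a, e) for a in ("lao30", "rao15", "pa0") for e in (60, 70, 80)]

# Hypothetical absorbed dose (mGy) to the critical organ and image-quality score
# for each (angle, energy) combination; real values would come from dosimetry models.
dose = dict(zip(settings, [1.4, 2.0, 2.9, 0.9, 1.5, 2.3, 1.1, 1.7, 2.6]))
quality = dict(zip(settings, [0.55, 0.70, 0.85, 0.40, 0.60, 0.80, 0.50, 0.65, 0.82]))

prob = pulp.LpProblem("dose_minimization", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", settings, cat="Binary")  # 1 if a setting is chosen

prob += pulp.lpSum(dose[s] * x[s] for s in settings)            # minimize organ dose
prob += pulp.lpSum(x[s] for s in settings) == 1                 # pick exactly one setting
prob += pulp.lpSum(quality[s] * x[s] for s in settings) >= 0.6  # keep diagnostic quality

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [s for s in settings if x[s].value() == 1]
print(chosen, pulp.value(prob.objective))
```

The dissertation's model additionally handles field size, table location, and robustness to organ motion; the sketch only conveys the dose-versus-quality trade-off at the core of the formulation.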
62

Avaliação da qualidade da imagem e taxa de exposição na cardiologia intervencionista / Evaluation of Image Quality and Exposure Rate in Interventional Cardiology.

Roberto Contreras Pitorri 14 October 2013 (has links)
Fluoroscopy is an X-ray imaging technique in which a dynamic image detector allows organ examinations to be followed in real time. The detectors currently in use are image intensifiers (II) and flat panels (FP); the former (vacuum-tube devices) mainly increase image brightness, while the latter (solid-state devices), adopted more recently in fluoroscopy equipment, improve image quality (contrast and detail) by reducing noise and artifacts. General fluoroscopy examinations cover the head, thorax, and abdomen; these were once performed on a single type of equipment, but with the evolution of the technology they are now performed on dedicated machines developed and assembled by several manufacturers across several continents (the Americas, Europe, and Asia).
The aim of this work is to analyze two groups of cardiac fluoroscopy equipment (II and FP detectors), from different institutions and manufacturers, to assess how the parameters of contrast, detail, and exposure rate at the detector entrance compare with their group averages and with international reference values, a quality-control exercise of interest to maintenance services as well as to medical physicists. For this purpose, a PMMA phantom was developed together with a test protocol derived from the literature, comprising: a) preliminary tests for accepting equipment into the sample; b) detail and contrast tests (using the developed phantom) to obtain their product, denominated the FOM (Figure of Merit); c) measurements of the exposure rate at the detector entrance (TEEDI); and d) analysis of the distributions of results for the two detector groups, comparing their averages with each other (for equipment used in Brazil) and with reference values from the international literature. The work showed that the phantom and the protocol, together with the methodology applied, were adequate for supporting quality control of the selected equipment, classifying it according to its potential for optimization of FOM and TEEDI. The average FOMs of the II and FP groups differ from the reference FOM by 35.5% and 35.0%, respectively, and the average TEEDIs for the II and FP groups differ from the reference TEEDI by 13.8% and 24.9%, respectively. The latter (mainly the FP group) should be adjusted by the maintenance service to bring them closer to the references used for the distributions obtained.
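Since the FOM reported above is the product of the contrast and detail scores, and the results are expressed as percentage distances from reference values, a small helper like the sketch below can reproduce that bookkeeping; the reference value of 0.80 and the group averages used here are placeholders, not measurements from the study.

```python
def figure_of_merit(contrast_score, detail_score):
    # FOM as defined in the protocol: the product of the contrast and detail scores.
    return contrast_score * detail_score

def percent_distance(measured, reference):
    # Relative distance of a measured (group-average) value from its reference, in percent.
    return 100.0 * abs(measured - reference) / reference

# Placeholder group averages against an assumed reference FOM of 0.80.
fom_reference = 0.80
print(round(percent_distance(0.516, fom_reference), 1))  # 35.5 (II group, illustrative)
print(round(percent_distance(0.520, fom_reference), 1))  # 35.0 (FP group, illustrative)
```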
63

Avaliação da correção de atenuação e espalhamento em imagens SPECT em protocolo cerebral / Evaluation of Attenuation and Scattering Correction in SPECT images of a Cerebral Protocol

Thays Berretta Käsemodel 22 September 2014 (has links)
Single photon emission computed tomography (SPECT) is a diagnostic modality in nuclear medicine in which the radiation emitted by a radiopharmaceutical previously administered to the patient is detected. Since the emitted photons interact with the patient's body, attenuation and scattering corrections are needed to better represent the distribution of the radiopharmaceutical and thus produce more accurate images. The aim of this work is to evaluate the parameters adopted as standard for tomographic image reconstruction, and the attenuation and scattering corrections of SPECT images, at the Hospital das Clínicas da Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, through qualitative and quantitative analysis of images reconstructed from tomographic acquisitions. Using a cerebral SPECT-CT protocol modified to two acquisition windows, SPECT and SPECT-CT images were acquired (BrightView XCT, Philips) with a Jaszczak phantom and reconstructed by the FBP, MLEM, and OSEM methods. The results show that FBP produces low-precision images due to low SNR. The evaluation suggests the iterative methods MLEM and OSEM with attenuation correction as the standard reconstruction method for cerebral perfusion images. Based on the Jaszczak phantom images and the analysis of the contrast between a cold sphere and the background, we propose observational analysis and evaluation of clinical images reconstructed by the OSEM method with 3 iterations, 16 subsets, and a Butterworth filter with cutoff frequency 0.34 and order 1 as the new standard reconstruction parameters.
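For readers unfamiliar with the iterative methods compared above, the sketch below shows the basic MLEM update on a toy system matrix; OSEM applies the same update over ordered subsets of the projection bins (for example, the 16 subsets mentioned above). The matrix, data, and iteration count are assumptions for illustration, not the clinical reconstruction used in the study.

```python
import numpy as np

def mlem(A, y, n_iter=3):
    """Basic MLEM sketch for y ≈ A @ x (not a clinical implementation).

    A: (n_bins, n_voxels) system matrix; y: measured projection counts.
    """
    x = np.ones(A.shape[1])
    sensitivity = A.T @ np.ones(A.shape[0])      # A^T 1, the sensitivity image
    for _ in range(n_iter):
        forward = A @ x                          # forward projection of the estimate
        ratio = y / np.maximum(forward, 1e-12)   # measured / estimated counts
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x

# Toy example: 4 projection bins, 3 voxels, synthetic noiseless counts.
rng = np.random.default_rng(0)
A = rng.random((4, 3))
x_true = np.array([2.0, 0.5, 1.0])
y = A @ x_true
print(mlem(A, y, n_iter=50))                     # approaches x_true as iterations grow
```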
64

Dosimetria e qualidade de imagem em mamografia digital / Dosimetry and Image Quality in Digital Mammography

XAVIER, Aline Carvalho da Silva 28 August 2015 (has links)
FACEPE / Mammography is the most effective diagnostic method for the early detection of breast disease. Digital systems produce radiographs with better image quality, but they are not always operated with optimized acquisition protocols, which can result in higher absorbed doses to the mammary glands. Dosimetric surveys and radiological protection assessments of institutions performing mammography are therefore pertinent. This work aims to estimate the mean glandular dose (MGD) in patients who underwent mammography on digital systems in Recife, Brazil, and to evaluate the quality of the images obtained. The equipment evaluated comprised: one Siemens Mammomat 3000 Nova unit with a Carestream CR reader (Kodak DirectView); two Lorad Hologic Selenia DR digital units; and one Lorad M-IV analog (screen-film) unit with a Kodak X-Omat 2000 film processor. MGDs were estimated using the methodology described by Dance et al., in patients aged 40 to 64 years, with compressed breast thicknesses ranging from 2 to 9 cm. Only routine views were considered (RCC, LCC, RMLO, LMLO). In all, the following irradiation parameters were collected for 5,200 mammograms: tube voltage (kV), tube loading (mAs), and anode/filter combination, together with the compressed breast thickness, compression force, and patient age. Image quality was assessed through the modulation transfer function (MTF), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) for different breast phantom thicknesses, following the International Atomic Energy Agency's Human Health Series No. 17. In parallel, radiologists evaluated 50 mammographic images using the same criteria. The results showed MGD values of 2.70 ± 0.50 mGy for the screen-film unit, 3.91 ± 0.72 mGy for the CR unit, 2.86 ± 0.53 mGy for DR1, and 3.83 ± 0.70 mGy for DR2. The corresponding average breast thicknesses were 4.82 cm, 6.73 cm, 6.02 cm, and 6.11 cm. The compression force was lower than recommended for the digital units and higher for the screen-film unit. The MGD values obtained with the breast phantom were lower than those obtained for patients of equivalent breast thickness. The SNR was considered adequate for all units. The CNR was not satisfactory for thicker breasts, but the MGD was well below the limit recommended for this breast type, indicating that the automatic exposure control should be adjusted to improve the CNR, even if the MGD increases. The MTF was satisfactory for both the CR unit and DR1. The evaluation of clinical images identified positioning failures, indicating a need for training of the operators of digital mammography equipment. Overall, this study is expected to raise awareness among the professionals involved of the importance of radiological protection concepts in digital mammography.
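The Dance et al. methodology cited above estimates the mean glandular dose as the product of the incident air kerma and tabulated conversion factors. A minimal sketch of that relation is shown below; the numerical values are illustrative placeholders, not the coefficients tabulated by Dance et al. or the doses measured in this study.

```python
def mean_glandular_dose(incident_air_kerma_mGy, g, c, s):
    """MGD = K * g * c * s (Dance et al. formulation).

    incident_air_kerma_mGy: incident air kerma at the breast surface, without backscatter
    g: kerma-to-glandular-dose conversion factor (tabulated against HVL and breast thickness)
    c: correction factor for breast glandularity
    s: correction factor for the anode/filter combination (X-ray spectrum)
    """
    return incident_air_kerma_mGy * g * c * s

# Illustrative values only: an 8 mGy incident air kerma and rounded factors.
print(mean_glandular_dose(8.0, g=0.38, c=1.0, s=1.0))  # 3.04 mGy
```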
65

Multi-Objective Heterogeneous Multi-Asset Collection Scheduling Optimization with High-Level Information Fusion

Muteba Kande, Joel 18 August 2021 (has links)
Surveillance of areas of interest through image acquisition is becoming increasingly essential for intelligence services. Several types of platforms equipped with sensors are used to collect good-quality images of the areas to be monitored. Work in this field has progressed on several fronts: some studies focus only on improving the quality of the images acquired by the sensors, others on the efficiency of platforms such as satellites, aircraft, and vessels that navigate the areas of interest, and yet others on optimizing the trajectories of these platforms. Beyond this, intelligence organizations are interested in carrying out such missions by sharing their resources. This thesis presents a framework whose main objective is to allow intelligence organizations to carry out their observation missions by pooling their platforms with other organizations that have similar or geographically close targets. The framework uses multi-objective optimization algorithms based on genetic algorithms to optimize such mission planning. Research on sensor fusion is a key element of this thesis: it has been shown that an image resulting from the fusion of two images from different sensors can provide more information than either original image. Given that the main goal of observation missions is to collect quality imagery, this work also uses High-Level Information Fusion to optimize mission planning based on image quality and fusion. The results of the experiments not only demonstrate the added value of this framework but also highlight its strengths (through performance metrics) compared to other similar frameworks.
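As a sketch of the multi-objective machinery underlying such a framework (illustrative only, not the thesis's genetic-algorithm implementation), the snippet below shows Pareto dominance and extraction of the non-dominated front for hypothetical collection plans scored on transit time and expected image quality.

```python
def dominates(a, b):
    # a Pareto-dominates b if it is no worse in every objective and strictly better in one
    # (all objectives are treated as minimized here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(plans):
    # Keep the plans whose objective vectors are not dominated by any other plan.
    return [name for name, obj in plans
            if not any(dominates(other, obj) for _, other in plans if other != obj)]

# Hypothetical plans scored on (total transit time, negated expected image quality).
plans = [("plan_A", (5.0, -0.8)), ("plan_B", (4.0, -0.6)), ("plan_C", (6.0, -0.9))]
print(non_dominated_front(plans))  # all three are Pareto-optimal trade-offs here
```

A multi-objective genetic algorithm such as NSGA-II builds its selection pressure on exactly this dominance relation, which is why the framework can return a set of trade-off mission plans rather than a single schedule.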
66

Multi-modality quality assessment for unconstrained biometric samples / Évaluation de la qualité multimodale pour des échantillons biométriques non soumis à des contraintes

Liu, Xinwei 22 June 2018 (has links)
The aim of this research is to investigate multi-modality biometric image quality assessment methods for unconstrained samples. Studies in biometrics have noted the significance of sample quality for a recognition system or a comparison algorithm, because the performance of a biometric system depends mainly on the quality of the sample images. The need to assess the quality of multi-modality biometric samples has grown with the requirement for highly accurate multi-modality biometric systems.
Following an introduction and background on biometrics and biometric sample quality, we introduce the concept of biometric sample quality assessment for multiple modalities. Recently established ISO/IEC quality standards for fingerprint, iris, and face are presented. In addition, sample quality assessment approaches designed specifically for contact-based and contactless fingerprint, near-infrared and visible-wavelength iris, and face are surveyed. Approaches for evaluating the performance of biometric sample quality assessment methods are also investigated. Based on the knowledge gathered from these challenges, we propose a common framework for the assessment of multi-modality biometric image quality. We review the previous classification of image-based quality attributes for single biometric modalities and investigate which image-based attributes are common across modalities. We then select and redefine the most important image-based quality attributes for the common framework. To link these quality attributes to real biometric samples, we develop a new multi-modality biometric image quality database containing both high-quality sample images and degraded images for the contactless fingerprint, visible-wavelength iris, and face modalities. The degradation types are based on the selected common image-based quality attributes. Another important aspect of the proposed common framework is image quality metrics and their applications in biometrics. We first introduce and classify the existing image quality metrics and then conduct a brief survey of no-reference image quality metrics that can be applied to biometric sample quality assessment. We also investigate how no-reference image quality metrics have been used for quality assessment of the fingerprint, iris, and face biometric modalities. Experiments evaluating the performance of no-reference image quality metrics on visible-wavelength face and iris images are conducted. The experimental results indicate that several no-reference image quality metrics can assess the quality of both iris and face biometric samples with a strong correlation. Lastly, the best-performing metric is re-trained on fingerprint images, which significantly improves the recognition performance of the biometric system compared with the original metric. Through the work carried out in this thesis, we have shown the applicability of no-reference image quality metrics for the assessment of unconstrained multi-modality biometric samples.
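As a concrete (and deliberately simple) example of the kind of no-reference metric surveyed above, the sketch below scores sharpness as the variance of a Laplacian response; it is not a metric proposed in the thesis, and the toy arrays and threshold-free usage are assumptions for illustration.

```python
import numpy as np

def laplacian_sharpness(gray):
    # Variance of a 4-neighbour Laplacian response: a simple no-reference sharpness score.
    # gray: 2-D float array; higher scores suggest a sharper (less defocused) sample.
    lap = (-4.0 * gray
           + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
           + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1))
    return float(lap.var())

# Blurring lowers the score, which is the behaviour a quality-assessment
# pipeline would rely on when flagging degraded biometric samples.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
blurred = (img + np.roll(img, 1, 0) + np.roll(img, 1, 1)
           + np.roll(np.roll(img, 1, 0), 1, 1)) / 4.0
print(laplacian_sharpness(img) > laplacian_sharpness(blurred))  # expected: True
```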
67

Iterative algorithms for fast, signal-to-noise ratio insensitive image restoration

Lie Chin Cheong, Patrick January 1987 (has links)
No description available.
68

A Comparative Image Quality Analysis between Multi-Slice Computed Tomography and Cone Beam Computed Tomography for Radiation Treatment Planning Purposes

Fentner, David A. 20 August 2013 (has links)
No description available.
69

Image Quality Analysis Using GLCM

Gadkari, Dhanashree 01 January 2004 (has links)
The gray level co-occurrence matrix (GLCM) has proven to be a powerful basis for texture classification. Various textural parameters calculated from the GLCM help characterize the overall image content. The aim of this research is to investigate the use of the GLCM technique as an absolute image quality metric. The underlying hypothesis is that image quality can be determined by a comparative process in which a sequence of images is compared to one another to determine the point of diminishing returns. An attempt is made to study whether the curve of image textural features versus image memory size can be used to decide the optimal image size. The approach used digitized images stored at several levels of compression. The GLCM proves to be a good discriminator for distinguishing different images; however, no such claim can be made for image quality. Hence the search for the best image quality metric continues.
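A minimal sketch of the GLCM computation the abstract relies on is shown below, assuming scikit-image is available (recent versions expose graycomatrix/graycoprops; older releases spell them greycomatrix/greycoprops); the toy image and chosen offsets are illustrative, not the study's data.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Toy grayscale image quantized to 8 gray levels; a real study would quantize
# each digitized image the same way before building its GLCM.
rng = np.random.default_rng(0)
image = (rng.random((128, 128)) * 8).astype(np.uint8)

# Co-occurrence matrix for a one-pixel horizontal offset, symmetric and normalized.
glcm = graycomatrix(image, distances=[1], angles=[0], levels=8,
                    symmetric=True, normed=True)

# Textural parameters commonly derived from the GLCM.
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, float(graycoprops(glcm, prop)[0, 0]))
```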
70

Visual Optics: Astigmatism

Cox, Michael J. January 2010 (has links)
No description available.
