751
Improving figures of merit and expanding applications for inductively coupled plasma mass spectrometry. Finley-Jones, Haley Joy, 03 December 2010
Although inductively coupled plasma mass spectrometry (ICP-MS) is generally considered a reliable analytical technique, increasing demands on its capabilities require continued research and improvements. ICP-MS is susceptible to both matrix effects and drift, leading to a decline in accuracy and precision. A number of techniques are routinely used to compensate for these issues. Internal standardization is one such solution that requires relatively simple sample preparation and yet offers the possibility of improving both accuracy and precision. In order to be effective, an optimal analyte/internal standard pair must be chosen. Traditionally, analyte/internal standard pairs are chosen based on similarities in mass and/or ionization potential. The present studies sought to develop a program that selects internal standards based on the minimization of analytical error. In total, 102 masses were monitored over 27 perturbations, i.e., changes to sample matrix and operating parameters. The standard deviations of the analyte/internal standard ratios were then used as a measure of internal standard performance. A thorough statistical analysis was conducted to identify trends linking good analyte/internal standard pairs to similarities in chemical properties. Similarities in mass offered the strongest relationship to a good internal standard choice, although many exceptions existed. The program was then tested over time and multiple instrument optimizations, as well as on a completely different ICP-MS instrument. Results of these tests suggest that the data originally collected for the prediction program are not instrument-specific, thus providing a broader base of useful applications.
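To make the selection criterion concrete, the sketch below ranks candidate internal standards by the relative standard deviation of the analyte/internal-standard ratio across perturbations (the thesis text refers to standard deviations; the relative form is used here only so candidates of different magnitude can be compared). The mass list and intensity values are hypothetical stand-ins, not the data collected for the thesis.

```python
import numpy as np
import pandas as pd

# Hypothetical intensity matrix: rows are perturbations (matrix/parameter changes),
# columns are monitored masses. Real data would come from the ICP-MS runs.
rng = np.random.default_rng(0)
masses = ["Li7", "Sc45", "Y89", "In115", "Tl205"]
intensities = pd.DataFrame(rng.lognormal(mean=10, sigma=0.05, size=(27, len(masses))),
                           columns=masses)

def best_internal_standard(analyte, data):
    """Rank candidate internal standards by the relative standard deviation
    of the analyte/internal-standard ratio across all perturbations."""
    scores = {}
    for candidate in data.columns:
        if candidate == analyte:
            continue
        ratio = data[analyte] / data[candidate]
        scores[candidate] = ratio.std() / ratio.mean()  # RSD of the ratio
    return pd.Series(scores).sort_values()

print(best_internal_standard("Y89", intensities).head())
```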
Due to its unmatched sensitivity and multielement capabilities, ICP-MS is frequently utilized for biological samples. A more recent application, however, seeks to use ICP-MS for the purpose of determining specific associations between metals and proteins. Such speciation requires a high-resolution, reproducible separation prior to ICP-MS analysis. Gel electrophoresis offers good separation and is well matched with the scanning properties of laser ablation sample introduction. The present study utilized native gel electrophoresis coupled with a uniquely modified electroblot system to improve sensitivity and to elucidate additional information. Chemically modified quartz fiber filters were successfully used as the transfer membrane to improve protein and metal capture efficiency.
752
Spectroscopic study of ocular diseases and detection of drug molecules. Σιδερούδη, Θεοχαρία, 13 March 2009
Raman spectroscopy is an inelastic light-scattering technique capable of detecting and characterizing molecules in a variety of aqueous solutions. The aim of this work is to develop a non-invasive, non-destructive spectroscopic method for the detection and quantitative determination of both pharmaceutical substances (e.g., antibiotics) and physiological substances (e.g., glucose) in the aqueous humor of the eye.
As part of this work, a new optical delivery geometry was developed to guide the laser beam into the eye; mounted on a Raman spectrometer with a CCD detector, it enables selective collection of the scattered light while scanning the anterior chamber in a 90-degree scattering geometry.
The experiments were performed (a) on porcine eyes in vitro, at most 24 hours after the animals were sacrificed and the globe removed, following injection of ceftazidime, amphotericin B and glucose into the anterior chamber, and (b) on an artificial anterior chamber (AAC) model fitted with corneas from porcine eyes, following injection of ceftazidime, amphotericin, amikacin sulfate and ciprofloxacin. In addition, a partial least squares (PLS) chemometric algorithm was used to predict the antibiotic concentrations in the anterior chamber model.
This new design avoids direct exposure of critical ocular tissues, such as the lens and the retina, to the laser beam, while at the same time achieving optimal collection of the scattered light and improving the signal-to-noise ratio of the spectra. Antibiotic concentrations in the range of the minimum inhibitory concentration were detected both in the aqueous humor of porcine eyes and in the anterior chamber model; glucose was detected at concentrations close to the pathological levels of diabetic patients. Based also on the RMS error of the PLS quantitative analysis, the method can reasonably be expected to find use in ophthalmology for pharmacokinetic studies as well as for the early diagnosis of diseases (e.g., diabetes mellitus). / Laser Raman spectroscopy is an inelastic light scattering technique able to characterize molecules in aqueous environments. The purpose of this work is to develop a non-contact and non-invasive spectroscopic method to identify and eventually quantify the presence of medicines (e.g. antibiotics) and physiological substances (e.g. glucose) in the aqueous humor of the eye.
A new laser light delivery probe has been developed and adapted to a Raman spectroscopic system, allowing favorable collection of the Raman-scattered light in a 90° scattering geometry while scanning the anterior chamber of the eye.
The technique is applied both to porcine eyes in vitro, at most 24 hours after death and enucleation, for ceftazidime, amphotericin B and glucose, and to a commercially available artificial anterior chamber (AAC) fitted with porcine corneas for ceftazidime, amphotericin B, amikacin sulphate and ciprofloxacin. Finally, a PLS chemometric algorithm has been developed to predict the antibiotic concentrations in the AAC.
This special illumination design reduces the direct exposure of critical ocular tissues, such as the lens and the retina, to the laser beam, while at the same time an optimal collection of the scattered light is accomplished. Concentrations close to the minimum inhibitory concentration (MIC) have been detected for the antibiotics both in porcine eyes and in the AAC; glucose has been detected at concentrations close to the early pathological levels of patients with diabetes. Furthermore, the quantification of the antibiotic concentrations in the AAC is accomplished by a partial least-squares (PLS) chemometric regression algorithm, and the RMS error of the validation procedure further underlines the promising prospect of applying Raman spectroscopy in ophthalmology.
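As a rough illustration of the PLS quantification step, the sketch below regresses concentration onto spectra with scikit-learn's PLSRegression; the spectra are synthetic stand-ins rather than measured Raman data, and the number of latent variables is an arbitrary choice.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_wavenumbers = 40, 600

# Synthetic spectra: a concentration-dependent band plus noise.
conc = rng.uniform(0, 100, n_samples)  # hypothetical concentrations
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 300) / 15) ** 2)
spectra = conc[:, None] * band + rng.normal(0, 0.5, (n_samples, n_wavenumbers))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, conc, cv=5).ravel()

rmse = np.sqrt(np.mean((predicted - conc) ** 2))
print(f"cross-validated RMSE: {rmse:.2f}")
```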
753
Curvelet imaging and processing: an overview. Herrmann, Felix J., January 2004
In this paper an overview is given of the application of directional basis functions, known under the name Curvelets/Contourlets, to various aspects of seismic processing and imaging. Key concepts in the approach are the use of (i) basis functions that localize in both domains (e.g. space and angle); (ii) non-linear estimation, which corresponds to localized muting of the coefficients, possibly supplemented by constrained optimization; and (iii) invariance of the basis functions under the imaging operators. We will discuss applications that include multiple and ground roll removal; sparseness-constrained least-squares migration and the computation of 4-D difference cubes.
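The non-linear estimation step amounts to muting (thresholding) coefficients in the transform domain. Curvelet transforms require a dedicated package (e.g. CurveLab), so the minimal sketch below uses a separable wavelet transform from PyWavelets as a stand-in for the directional transform; the test image and threshold are arbitrary.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
clean = np.zeros((128, 128))
clean[32:96, 32:96] = 1.0                      # a toy "reflector"
noisy = clean + rng.normal(0, 0.3, clean.shape)

# Decompose, softly threshold the detail coefficients, reconstruct.
coeffs = pywt.wavedec2(noisy, "db4", level=3)
thresh = 3 * 0.3                               # ~3 sigma of the (assumed known) noise
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(c, thresh, mode="soft") for c in level)
    for level in coeffs[1:]
]
estimate = pywt.waverec2(denoised_coeffs, "db4")
print("residual energy:", float(np.sum((estimate - clean) ** 2)))
```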
754
Singular Value Decomposition in Image Noise Filtering and Reconstruction. Workalemahu, Tsegaselassie, 22 April 2008
The Singular Value Decomposition (SVD) has many applications in image processing. The SVD can be used to restore a corrupted image by separating significant information from the noise in the image data set. This thesis outlines broad applications that address current problems in digital image processing. In conjunction with SVD filtering, image compression using the SVD is discussed, including the process of reconstructing or estimating a rank reduced matrix representing the compressed image. Numerical plots and error measurement calculations are used to compare results of the two SVD image restoration techniques, as well as SVD image compression. The filtering methods assume that the images have been degraded by the application of a blurring function and the addition of noise. Finally, we present numerical experiments for the SVD restoration and compression to evaluate our computation.
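A minimal sketch of the rank-reduced reconstruction described above, using NumPy's SVD on a synthetic noisy image; the retained rank k is an illustrative choice, not one taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
image = np.sin(8 * x) + np.cos(5 * y)                  # smooth, low-rank test image
noisy = image + rng.normal(0, 0.2, image.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)

k = 10                                                  # retained rank (illustrative)
restored = (U[:, :k] * s[:k]) @ Vt[:k, :]

compression_ratio = image.size / (k * (U.shape[0] + Vt.shape[1] + 1))
error = np.linalg.norm(restored - image) / np.linalg.norm(image)
print(f"rank {k}: relative error {error:.3f}, compression ratio {compression_ratio:.1f}x")
```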
755
Method for Improving the Efficiency of Image Super-Resolution Algorithms Based on Kalman Filters. Dobson, William Keith, 01 December 2009
The Kalman Filter has many applications in control and signal processing but may also be used to reconstruct a higher resolution image from a sequence of lower resolution images (or frames). If the sequence of low resolution frames is recorded by a moving camera or sensor, where the motion can be accurately modeled, then the Kalman filter may be used to update pixels within a higher resolution frame to achieve a more detailed result. This thesis outlines current methods of implementing this algorithm on a scene of interest and introduces possible improvements for the speed and efficiency of this method by use of block operations on the low resolution frames. The effects of noise on camera motion and various blur models are examined using experimental data to illustrate the differences between the methods discussed.
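A much-simplified sketch of the idea, assuming each low-resolution frame has already been registered to the high-resolution grid so that every low-resolution sample maps to one high-resolution pixel; the per-pixel scalar update below ignores blur and the block operations that are the subject of the thesis.

```python
import numpy as np

def kalman_sr_update(x, P, z, H, r):
    """One measurement update of a per-pixel Kalman filter.

    x : current high-resolution estimate (flattened), P : its variances,
    z : one registered low-resolution frame (flattened),
    H : index map assigning each LR sample to an HR pixel,
    r : measurement noise variance.
    """
    for lr_idx, hr_idx in enumerate(H):
        innovation = z[lr_idx] - x[hr_idx]
        gain = P[hr_idx] / (P[hr_idx] + r)       # scalar Kalman gain
        x[hr_idx] += gain * innovation
        P[hr_idx] *= (1.0 - gain)                # posterior variance shrinks
    return x, P

# Toy usage: 2x upsampling of a 1D signal observed in two shifted frames.
rng = np.random.default_rng(4)
truth = np.sin(np.linspace(0, 2 * np.pi, 64))
x, P = np.zeros(64), np.full(64, 1.0)            # diffuse prior
for shift in (0, 1):                             # known sub-pixel motion
    H = np.arange(shift, 64, 2)                  # which HR pixels this frame samples
    z = truth[H] + rng.normal(0, 0.05, H.size)
    x, P = kalman_sr_update(x, P, z, H, r=0.05 ** 2)
print("mean absolute error:", float(np.mean(np.abs(x - truth))))
```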
756
Data-driven estimation for Aalen's additive risk model. Boruvka, Audrey, 02 August 2007
The proportional hazards model developed by Cox (1972) is by far the most widely used method for regression analysis of censored survival data. Application of the Cox model to more general event history data has become possible through extensions using counting process theory (e.g., Andersen and Borgan (1985), Therneau and Grambsch (2000)). With its development based entirely on counting processes, Aalen’s additive risk model offers a flexible, nonparametric alternative. Ordinary least squares, weighted least squares and ridge regression have been proposed in the literature as estimation schemes for Aalen’s model (Aalen (1989), Huffer and McKeague (1991), Aalen et al. (2004)). This thesis develops data-driven parameter selection criteria for the weighted least squares and ridge estimators. Using simulated survival data, these new methods are evaluated against existing approaches. A survey of the literature on the additive risk model and a demonstration of its application to real data sets are also provided. / Thesis (Master, Mathematics & Statistics) -- Queen's University, 2007-07-18.
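For illustration, the sketch below computes ridge-penalized least-squares increments of Aalen's cumulative regression functions on simulated data; the penalty value is fixed arbitrarily, so it does not perform the data-driven selection developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 200, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # intercept + covariates
rate = 0.1 * np.exp(0.5 * X[:, 1])                            # toy hazard for simulation
T = rng.exponential(1.0 / rate)
C = rng.exponential(8.0, n)                                   # censoring times
time, event = np.minimum(T, C), (T <= C)

def aalen_ridge(time, event, X, lam=1.0):
    """Cumulative regression functions B(t) via ridge least-squares increments."""
    order = np.argsort(time)
    times, B, cum = [], [], np.zeros(X.shape[1])
    at_risk = np.ones(len(time), dtype=bool)
    for i in order:
        if event[i]:
            Y = X[at_risk]                                     # at-risk design matrix
            dN = (np.flatnonzero(at_risk) == i).astype(float)  # who fails at this time
            dB = np.linalg.solve(Y.T @ Y + lam * np.eye(X.shape[1]), Y.T @ dN)
            cum = cum + dB
            times.append(time[i]); B.append(cum.copy())
        at_risk[i] = False                                     # subject leaves risk set
    return np.array(times), np.array(B)

times, B = aalen_ridge(time, event, X)
print("estimated cumulative effects at the last event time:", B[-1])
```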
757
Novel 3D Back Reconstruction using Stereo Digital Cameras. Kumar, Anish, date unknown
No description available.
758
Second-order Least Squares Estimation in Generalized Linear Mixed Models. Li, He, 06 April 2011
Maximum likelihood is a ubiquitous method used in the estimation of generalized linear mixed models (GLMM). However, the method entails computational difficulties and relies on the normality assumption for the random effects. We propose a second-order least squares (SLS) estimator based on the first two marginal moments of the response variables. The proposed estimator is computationally feasible and requires fewer distributional assumptions than the maximum likelihood estimator. To overcome the numerical difficulty of minimizing an objective function that involves multiple integrals, a simulation-based SLS estimator is proposed. We show that the SLS estimators are consistent and asymptotically normally distributed under fairly general conditions in the framework of GLMM.
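To fix ideas, the sketch below applies the second-order least squares criterion to a plain linear model rather than the GLMM of the thesis: the first two marginal moments are matched simultaneously and the objective is minimized numerically, with identity weights for simplicity.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true, sigma2_true = np.array([1.0, -2.0]), 0.5
y = X @ beta_true + rng.normal(0, np.sqrt(sigma2_true), n)

def sls_objective(theta, X, y):
    """Sum of squared deviations of the first two marginal moments (identity weights)."""
    beta, sigma2 = theta[:-1], theta[-1]
    mu = X @ beta
    rho1 = y - mu                        # first-moment residual
    rho2 = y ** 2 - (mu ** 2 + sigma2)   # second-moment residual
    return np.sum(rho1 ** 2 + rho2 ** 2)

start = np.array([0.0, 0.0, 1.0])
fit = minimize(sls_objective, start, args=(X, y), method="Nelder-Mead")
print("SLS estimates (beta0, beta1, sigma^2):", fit.x)
```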
Missing data is almost inevitable in longitudinal studies. Problems arise if the missing-data mechanism is related to the response process. This thesis develops the proposed estimators to deal with response data missing at random, either by adapting the inverse probability weighting method or by applying the multiple imputation approach.
In practice, some of the covariates are not directly observed but are measured with error. It is well known that simply substituting a proxy variable for the unobserved covariate in the model will generally lead to biased and inconsistent estimates. We propose the instrumental variable method for the consistent estimation of GLMM with covariate measurement error. The proposed approach does not need any parametric assumption on the distribution of the unknown covariates. This makes the method less restrictive than other approaches, which either rely on a parametric distribution of the covariates or estimate that distribution using extra information.
In the presence of outliers, there is a concern that the SLS estimators may be vulnerable because they are based on second-order moments. We investigated the robustness of the SLS estimators using their influence functions and showed that the proposed estimators have a bounded influence function and a redescending property, so they are robust to outliers.
The finite-sample performance and properties of the SLS estimators are studied and compared with those of other popular estimators in the literature through simulation studies and real-world data examples.
759
A multivariate approach to QSAR. Hellberg, Sven, January 1986
Quantitative structure-activity relationships (QSAR) constitute empirical analogy models connecting chemical structure and biological activity. The analogy approach to QSAR assumes that the factors important in the biological system are also contained in chemical model systems. The development of a QSAR can be divided into subproblems: 1. to quantify chemical structure in terms of latent variables expressing analogy, 2. to design test series of compounds, 3. to measure biological activity and 4. to construct a mathematical model connecting chemical structure and biological activity. In this thesis it is proposed that many possibly relevant descriptors should be considered simultaneously in order to efficiently capture the unknown factors inherent in the descriptors. The importance of multivariately and multipositionally varied test series is discussed. Multivariate projection methods such as PCA and PLS are shown to be appropriate for QSAR and to closely correspond to the analogy assumption. The multivariate analogy approach is applied to (a) beta-adrenergic agents, (b) haloalkanes, (c) halogenated ethyl methyl ethers and (d) four different families of peptides. / Diss. (summary), Umeå: Umeå universitet, 1986, comprising 8 papers.
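A small sketch of the multivariate projection step on a hypothetical descriptor matrix: PCA summarizes the descriptors into latent variables and PLS relates them to a measured activity. The descriptor and activity values are simulated, not taken from the thesis series.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n_compounds, n_descriptors = 20, 12

# Simulated physicochemical descriptors (e.g. lipophilicity, size, electronic terms).
descriptors = rng.normal(size=(n_compounds, n_descriptors))
activity = descriptors[:, :3] @ np.array([1.0, -0.5, 0.8]) + rng.normal(0, 0.2, n_compounds)

Z = StandardScaler().fit_transform(descriptors)

scores = PCA(n_components=2).fit_transform(Z)        # latent variables for series design
pls = PLSRegression(n_components=2).fit(Z, activity) # links structure to activity
print("PC scores of the first compound:", scores[0])
print("PLS R^2 on the training series:", round(pls.score(Z, activity), 3))
```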
760
Multi-scale analysis of 3D objects: application to shape assembly. Mellado, Nicolas, 06 December 2012
In recent years, advances in acquisition techniques have led to the widespread use of very dense 3D objects, represented as point clouds of several million vertices. Given the complexity of these data, it is often necessary to analyze them in order to extract the most relevant structures, which may be defined at multiple scales. Among the many methods traditionally used to analyze digital signals, scale-space analysis is today a standard for studying curves and images. However, its adaptation to 3D data raises stability issues and requires connectivity information, which is not directly defined for point clouds. In this thesis, we present a set of mathematical tools for the analysis of 3D objects, under the name Growing Least Squares (GLS). We propose to represent the geometry described by a point cloud with a second-order primitive fitted by least-squares minimization, and this at multiple scales. This description is then differentiated analytically to extract, in a continuous manner, the most relevant structures in both space and scale. We show through several examples and comparisons that this representation and the associated tools provide an efficient solution for multi-scale point cloud analysis. An interesting challenge is the analysis of 3D objects acquired in the context of cultural heritage studies. In this thesis, we study the data generated by the acquisition of fragments of the statues that once surrounded the Lighthouse of Alexandria, Seventh Wonder of the World. More specifically, we address the reassembly of objects broken into a small number of fragments (about ten), but with many parts missing or heavily degraded by the passage of time. We propose a framework for designing semi-automatic virtual assembly systems that combine the knowledge of archaeologists with the precision of assembly algorithms. We present two systems based on this framework and demonstrate their effectiveness on concrete cases.
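As a rough illustration of the multi-scale fitting idea, the sketch below fits a weighted least-squares plane (a simplified stand-in for the algebraic sphere primitive used by GLS) to the neighborhood of a query point over a range of radii and reports how the fitted normal evolves with scale; the point cloud is synthetic.

```python
import numpy as np

rng = np.random.default_rng(8)
# Synthetic point cloud: a gently curved surface z = 0.1*(x^2 + y^2) with noise.
xy = rng.uniform(-1, 1, (5000, 2))
pts = np.column_stack([xy, 0.1 * (xy ** 2).sum(axis=1) + rng.normal(0, 0.005, len(xy))])

def fit_plane_at_scale(pts, center, radius):
    """Weighted least-squares plane fit within `radius` of `center`.
    Returns the unit normal, or None if too few neighbors."""
    d2 = np.sum((pts - center) ** 2, axis=1)
    mask = d2 < radius ** 2
    if mask.sum() < 10:
        return None
    w = (1.0 - d2[mask] / radius ** 2) ** 2        # smooth, compactly supported weights
    p = pts[mask]
    mean = np.average(p, axis=0, weights=w)
    cov = ((p - mean) * w[:, None]).T @ (p - mean) / w.sum()
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, 0]                           # direction of least variance = normal

center = np.array([0.2, 0.1, 0.1 * (0.2 ** 2 + 0.1 ** 2)])
for radius in (0.05, 0.1, 0.2, 0.4):
    normal = fit_plane_at_scale(pts, center, radius)
    if normal is not None:
        print(f"scale {radius:.2f}: normal {np.round(normal, 3)}")
```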