961 |
Development of experimental and analysis methods to calibrate and validate super-resolution microscopy technologies / Développement de méthodes expérimentales et d'analyse pour calibrer et valider les technologies de microscopie de super-résolution. Salas, Desireé. 27 November 2015.
Super-resolution microscopy (SRM) methods such as photoactivated localization microscopy (PALM), stochastic optical reconstruction microscopy (STORM), binding-activated localization microscopy (BALM) and DNA-PAINT represent a new collection of light microscopy techniques that overcome the diffraction limit (> 200 nm in the visible spectrum). These methods are based on localizing bursts of fluorescence from single fluorophores and can reach nanometre resolutions (~20 nm laterally and ~50 nm axially). SRM techniques have a broad spectrum of applications in biology and biophysics, giving access to structural and dynamic information on known and unknown biological structures in vivo and in vitro. Many efforts have been made over the last decade to increase the potential of these methods: developing more precise and faster localization techniques, improving fluorophore photophysics, and devising algorithms that extract quantitative information and increase localization precision. However, very few methods have been developed to dissect image heterogeneity and to extract statistically relevant information from thousands of individual super-resolved images. In my thesis, I specifically tackled these limitations by: (1) constructing objects with nanometre dimensions and well-defined structures that can be tailored to any need; these objects are based on DNA origami; (2) developing labeling approaches to image these objects homogeneously, based on adaptations of BALM and DNA-PAINT microscopies; and (3) implementing statistical tools to improve image analysis and validation, based on single-particle reconstruction methods commonly applied to image reconstruction in electron microscopy. I specifically applied these developments to reconstruct the 3D shape of two model DNA origami (in one and three dimensions). I show how this method permits the dissection of sample heterogeneity and the combination of similar images to improve the signal-to-noise ratio. Combining different class averages permitted the reconstruction of the three-dimensional shape of the DNA origami. Notably, because the method uses 2D projections of different views of the same structure, it recovers isotropic resolution in three dimensions. Specific functions were adapted from previous methodologies to quantify the reliability of the reconstructions and their resolution. In the future, these developments will be helpful for the 3D reconstruction of any biological object that can be imaged at super-resolution by PALM, STORM or PAINT-derived methodologies.
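To make the localization principle concrete: a single fluorophore produces a diffraction-limited spot hundreds of nanometres wide, yet fitting a model of the point spread function recovers its centre to within a few tens of nanometres. The following is a minimal sketch of that fitting step; the pixel size, photon counts and fit parameters are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch: localize one blinking event by fitting a 2D Gaussian
# to a diffraction-limited spot. All parameter values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

px = 100.0                                 # assumed pixel size in nm
yy, xx = np.mgrid[0:9, 0:9]                # 9 x 9 pixel region of interest

def psf(coords, x0, y0, s, a, b):
    """2D Gaussian model of the point spread function plus background."""
    x, y = coords
    return (a * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * s**2)) + b).ravel()

# simulate one event: true position between pixel centres, Poisson shot noise
truth = (4.3, 3.7, 1.3, 200.0, 10.0)       # x0, y0, width, amplitude, background
img = np.random.poisson(psf((xx, yy), *truth).reshape(9, 9))

p, _ = curve_fit(psf, (xx, yy), img.ravel(), p0=(4, 4, 1.5, img.max(), 5))
err = np.hypot(p[0] - truth[0], p[1] - truth[1]) * px
print(f"localization error: {err:.1f} nm")  # typically tens of nm per event
```

Repeating this fit over many thousands of blinking events is what builds up a super-resolved image, which is also why the heterogeneity and reconstruction questions addressed in the thesis arise.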
|
962 |
Painting in the twenty-tens; where to now? (You can’t touch this!). Olofsson, Max. January 2012.
The essay is a manifesto-like personal take on painting, and a redefinition of painting in the digital age. Careless usage of the term ”painting” has led to a diluted descriptive function and a waning categorizing capacity: almost anything can be called painting, which in turn puts actual painting in an awkward position where it, apart from being itself, could be almost anything. The term “painthing” is introduced to distinguish painting from works that, besides their two-dimensional visual information, also make a point of their specific materiality. The essay brings up cave paintings and links them to video games, suggesting that video games have gone through the reverse of the evolution of the history of painting: from abstraction to representation. It speaks of the problems of documentation, that is, the translation of visual information (or re-flattening of a flat surface), and of the cultural equalization of information and images on the internet through their common denominator, the pixel. It also describes “information painting”: in short, digital painting where there is no physical object to be translated into a documentation of itself, but rather a painting that is original in its documentation form (its digital form), painting that strives to be nothing but the utopia of an image, the untouchable/unreachable visual information.
|
963 |
The Fear of Mrs. Bates: The Use of Psychoanalytical Aspects, Anticipation and Retrospection in Robert Bloch’s Psycho. Spolander, Rebecca. January 2018.
This essay focuses on psychoanalytical notions in Robert Bloch’s novel Psycho. The theoretical framework is based on Sigmund Freud’s theory of psychoanalysis. Slavoj Žižek’s idea that the house serves as a symbol of Freud’s concepts of the Super-Ego, Ego and Id is presented and further developed. Moreover, the essay exemplifies how the idea of repression as a defense mechanism can be traced in the novel, and explains that repression is used as a tool for making the reader feel sympathy for Norman Bates. In addition, Wolfgang Iser’s reception theory is used to explain how Bloch uses gaps and pre-intentions to create anticipation and retrospection in the reader and so produce suspense and horror. The intention is to prove that the attention to psychological issues is what makes the monster of the novel more sympathetic and recognizable to us as readers. Thus, the result is that we position ourselves closer to the monster, which leaves us wondering if we could, due to our shared psychology, be monsters as well.
|
964 |
The Role of Cytoskeletal Morphology in the Nanoorganization of Synapse. Kaliyamoorthy, Venkatapathy. January 2016.
The synapse is the fundamental unit of synaptic transmission. Learning, memory and neurodegenerative diseases of the brain are attributed to the maintenance and alteration of synaptic connections. The efficiency of synaptic transmission depends on how well the postsynapse receives signals from the presynapse; this in turn depends on the receptors present in the postsynaptic density (PSD). The PSD, located in the postsynapse directly opposite the neurotransmitter release site of the presynapse (the active zone), is an indispensable part of the synapse. It comprises receptors and scaffold proteins and is ultimately supported by the actin cytoskeleton of the dendritic spine. Cytoskeletal dynamics has been shown to influence the structural plasticity of the spine and of the PSD, but how it regulates the dynamics of synaptic transmission is not completely understood. Here we studied the influence of actin depolymerization on the sub-synaptic organization of an excitatory synapse. Conventional microscopy cannot be employed to study the organization of the synapse at molecular resolution because of the diffraction limit.
Super-resolution microscopy circumvents this diffraction limit. In this study we used a custom-built fluorescence microscope with a Total Internal Reflection Fluorescence (TIRF) modality to observe nanometre-sized structures inside spines of mouse hippocampal primary neurons. The setup was integrated with Metamorph imaging software for both microscope operation and image acquisition, together with a suitable laser system, and achieved a lateral resolution of ~30 nm and an axial resolution of ~51 nm. Overall, we observed a loss of spines and a significant reduction in the area of nanometre-sized protein clusters in the postsynaptic density within the spines of latrunculin A-treated mouse hippocampal primary neurons compared to untreated neurons. Along with the morphological alterations, we also observed changes in the nanoscale organization of a few key molecules in the postsynaptic density.
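The thesis does not spell out its cluster-analysis pipeline here, but one common way to quantify the area of such nanoclusters from single-molecule localization data is density-based clustering followed by a hull measurement. The sketch below, on synthetic localizations with illustrative parameters, shows the idea; it is only one plausible approach, not the method used in the study.

```python
# Sketch: quantify nanocluster areas from localization coordinates with
# DBSCAN + convex hulls. Data and parameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)
# synthetic localizations (nm): two tight clusters plus uniform background
locs = np.vstack([rng.normal((200, 200), 15, (150, 2)),
                  rng.normal((600, 450), 20, (200, 2)),
                  rng.uniform(0, 800, (100, 2))])

labels = DBSCAN(eps=30, min_samples=10).fit_predict(locs)  # eps ~ localization precision
for c in set(labels) - {-1}:                               # label -1 = background noise
    area = ConvexHull(locs[labels == c]).volume            # 2D hull "volume" is area
    print(f"cluster {c}: {np.sum(labels == c)} locs, area = {area:.0f} nm^2")
```

Comparing the distribution of such areas between treated and untreated neurons is one way a reduction in cluster size, like the one reported above, could be tested.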
|
965 |
Human-like Super Mario Play using Artificial Potential Fields. Satish, Likith Poovanna Kelapanda; Ethiraj, Vinay Sudha. January 2012.
Artificial potential fields is a technique that uses attractive and repelling forces to control, e.g., robots or non-player characters in games. We show how this technique may be used in a controller for Super Mario in a way that creates a human-like playing style. By combining fields of progression, opponent avoidance and rewards, we get a controller that tries to collect the rewards and avoid the opponents while progressing towards the goal of the level; a sketch of such a field combination follows below. We use human test subjects to improve the controller further by letting them make pair-wise comparisons of the bot against recordings of human play, and we use the feedback to calibrate the bot for human-like play.
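As a rough illustration of how such fields combine, the sketch below sums an attractive progression field, repulsive opponent fields and mild reward fields, then greedily climbs the combined field one step at a time. The field shapes, weights and two-dimensional toy world are illustrative assumptions, not the controller described above.

```python
# Sketch of an artificial-potential-field controller: progression,
# avoidance and reward fields summed into one scalar field.
import numpy as np

def potential(pos, goal, enemies, coins):
    """Combined field value at a candidate position (higher is better)."""
    p = -np.linalg.norm(pos - goal)            # progression: pull toward goal
    for e in enemies:
        p -= 50.0 / (np.linalg.norm(pos - e) + 1e-6)  # strong push away from enemies
    for c in coins:
        p += 5.0 / (np.linalg.norm(pos - c) + 1.0)    # mild pull toward rewards
    return p

def next_move(pos, goal, enemies, coins):
    """Greedy hill climbing: pick the one-step move with the best field value."""
    moves = [np.array(m) for m in ((1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (1, -1))]
    return max(moves, key=lambda m: potential(pos + m, goal, enemies, coins))

pos = np.array([0.0, 0.0])
goal = np.array([100.0, 0.0])
enemies = [np.array([10.0, 0.0])]
coins = [np.array([8.0, 4.0])]
for _ in range(5):
    pos = pos + next_move(pos, goal, enemies, coins)
    print(pos)   # the agent detours toward the coin and around the enemy
```

Tuning the relative weights of the three fields is what shifts the play style between reckless, greedy and cautious, which is the knob the human pair-wise comparisons calibrate.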
|
966 |
Super PAC's: Hur ska de förstås? / Super PACs: How should they be understood? Berg Johansson, Emmelie; Palm, Jennifer. January 2012.
The following study concerns Super PACs and how they should be understood, in light of the theories of pluralism and elitism. The study aimed to answer this question by examining American newspaper articles. American media supplied the empirical material because it is in the United States that Super PACs operate. Using a qualitative content analysis, units of meaning were identified in the collected empirical material and then categorized according to the basic assumptions of the chosen theories. The results highlight the fact that Super PACs can be understood as both positive and negative for the political process. An individual's view of the phenomenon may depend both on political affiliation and on the theoretical framework from which the study proceeds. Super PACs are complex enough that many different factors come into play in the interpretation of these organizations.
|
967 |
Approximate Nearest Neighbour Field Computation and Applications. Avinash Ramakanth, S. January 2014.
Approximate Nearest-Neighbour Field (ANNF) maps between two related images are commonly used by the computer vision and graphics community for image editing, completion, retargeting and denoising. In this work we generalize ANNF computation to unrelated image pairs. For accurate ANNF map computation we propose Feature Match, in which low-dimensional features approximate image patches, combined with global colour adaptation. Unlike existing approaches, the proposed algorithm does not assume any relation between image pairs and thus generalizes ANNF maps to arbitrary, unrelated image pairs. This generalization enables the ANNF approach to handle a wider range of vision applications more efficiently. A brief description of the applications developed using the proposed Feature Match framework follows.
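As a rough sketch of the core idea of matching patches through low-dimensional features, the following code reduces patch vectors with PCA and queries a KD-tree for each source patch's nearest destination patch. This stand-in pipeline is an assumption for illustration; it is not the Feature Match algorithm itself, which uses its own features and colour adaptation.

```python
# Sketch: an approximate nearest-neighbour field via low-dimensional
# patch features (PCA + KD-tree). Illustrative stand-in, not Feature Match.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KDTree

def patches(img, k=7):
    """All k x k patches of a grayscale image, as row vectors."""
    H, W = img.shape
    out = np.stack([img[i:i + k, j:j + k].ravel()
                    for i in range(H - k + 1) for j in range(W - k + 1)])
    return out, (H - k + 1, W - k + 1)

def annf(src, dst, k=7, dims=8):
    """For every src patch, the index of its nearest dst patch,
    matched in a low-dimensional feature space instead of raw pixels."""
    ps, shape_s = patches(src, k)
    pd, _ = patches(dst, k)
    pca = PCA(n_components=dims).fit(pd)     # low-dimensional patch features
    tree = KDTree(pca.transform(pd))
    _, idx = tree.query(pca.transform(ps), k=1)
    return idx.ravel().reshape(shape_s)

src = np.random.rand(32, 32)
dst = np.random.rand(32, 32)
print(annf(src, dst).shape)                  # (26, 26) map of matches
```

Because the match happens in the reduced feature space, no spatial coherence between the two images is assumed, which is what lets such a map work on unrelated image pairs.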
The first application addresses the problem of detecting the optic disk in retinal images. The combination of ANNF maps and salient properties of optic disks leads to an efficient optic disk detector that does not require tedious training or parameter tuning. The proposed approach is evaluated on many publicly available datasets, and an average detection accuracy of 99% is achieved with a computation time of 0.2 s per image. The second application aims to super-resolve a given synthetic image using a single source image as dictionary, avoiding the expensive training involved in conventional approaches. In the third application, we make use of ANNF maps to accurately propagate labels across video frames for segmenting video objects; the proposed approach outperforms the state of the art on the widely used SegTrack benchmark dataset. In the fourth application, ANNF maps obtained between two consecutive frames of video are enhanced to estimate sub-pixel-accurate optical flow, a critical step in many vision applications. Finally, a summary of the framework for various other possible applications, such as image encryption and scene segmentation, is provided.
|
968 |
Hydrophobic and superhydrophobic surfaces by means of atmospheric plasmas: synthesis and texturization of fluorinated materials. Hubert, Julie. 8 September 2014.
In this thesis, we focused on understanding the synthesis and texturization processes of hydrophobic and (super)hydrophobic fluorinated surfaces produced by atmospheric plasmas.

First, we focused on the surface modifications of a model surface, polytetrafluoroethylene (PTFE), by the post-discharge of a radio-frequency plasma torch. The post-discharge used for the surface treatment was characterized by optical emission spectroscopy (OES) and mass spectrometry (MS) as a function of the gap (torch-sample distance) and the helium and oxygen flow rates. Mechanisms explaining the production and consumption of the identified species (N2, N2+, He, O, OH, O2m, O2+, Hem) were proposed.

The surface treatment was then investigated as a function of the kinematic parameters (from the motion robot connected to the plasma torch) and the gas flow rates. Although no change in the surface composition was recorded, oxygen is required to increase the hydrophobicity of the PTFE by increasing its roughness, while a pure helium plasma leads to a smoothing of the surface. Based on complementary experiments involving mass loss, wettability and topography measurements, coupled to the detection of fluorinated species on an aluminium foil by XPS, we highlighted an anisotropic etching oriented vertically in depth as a function of the number of scans (associated with the treatment time). Atomic oxygen is assumed to be the species responsible for the preferential etching of the amorphous phase leading to the rough surface, while the highly energetic helium metastables and/or VUV photons are presumed to induce the higher mass loss recorded in a pure helium plasma.

The second part of this thesis was dedicated to the deposition and texturization of fluorinated coatings in the dielectric barrier discharge (DBD). The effects of the nature of the precursor (C6F12 and C6F14), the nature of the carrier gas (argon or helium), the plasma power and the precursor flow rate were investigated in terms of chemical composition, wettability, topography and crystallinity by SIMS, XPS, WCA, AFM and XRD. We showed that hydrophobic surfaces with water contact angles (WCA) higher than 115° were obtained only in the presence of argon, and are assumed to be due to the roughness created by the micro-discharges. Plasma-polymerized films deposited in helium were smooth, and no WCA higher than 115° was observed. We also studied the impact of the deposition rate and the layer thickness on the hydrophobic properties, as well as the polymerization processes, through gas-phase characterization. Doctorat en Sciences.
|
969 |
Low complexity turbo equalization using superstructures. Myburgh, Hermanus Carel. January 2013.
In a wireless communication system the transmitted information is subjected to a number of impairments, among which inter-symbol interference (ISI), thermal noise and fading are the most prevalent. Owing to the dispersive nature of the communication channel, ISI results from the arrival of multiple delayed copies of the transmitted signal at the receiver. Thermal noise is caused by the random fluctuation of electrons in the receiver hardware, while fading is the result of constructive and destructive interference, as well as absorption during transmission. To protect the source information, error-correction coding (ECC) is performed in the transmitter, after which the coded information is interleaved in order to separate the transmitted information temporally.
Turbo equalization (TE) is a technique whereby equalization (to correct ISI) and decoding (to correct errors) are performed jointly, by iteratively exchanging the extrinsic information formed from the optimal posterior probabilistic information produced by each algorithm. The extrinsic information determined from the decoder output is used as prior information by the equalizer, and vice versa, allowing the bit-error rate (BER) performance to improve with each iteration; a toy sketch of this exchange follows below. Turbo equalization achieves excellent BER performance, but its computational complexity grows exponentially with increases in channel memory as well as in encoder memory, and it can therefore not be used in dispersive channels where the channel memory is large. A number of low complexity equalizers have consequently been developed to replace the maximum a posteriori probability (MAP) equalizer in order to reduce the complexity. Some of the resulting low complexity turbo equalizers achieve performance comparable to that of a conventional turbo equalizer that uses a MAP equalizer. In other cases the low complexity turbo equalizers perform much worse than the corresponding conventional turbo equalizer (CTE), because of suboptimal equalization and the inability of the low complexity equalizers to use the extrinsic information effectively as prior information.
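For concreteness, the following is a minimal sketch of the conventional iterative exchange described above, assuming BPSK over a known two-tap ISI channel, an exact MAP (BCJR) equalizer on the channel's two-state trellis, a rate-1/2 repetition code (whose extrinsic computation is trivial: each copy's extrinsic is its sibling's channel evidence) and a random interleaver. All of these choices are illustrative assumptions; they are not the systems or the superstructure equalizers developed in this thesis.

```python
# Toy conventional turbo equalizer: MAP (BCJR) equalizer <-> repetition decoder
# exchanging extrinsic log-likelihood ratios (LLRs) through an interleaver.
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5])                       # known 2-tap channel
sigma2 = 0.3                                   # noise variance
info = rng.integers(0, 2, 500)                 # information bits
coded = np.repeat(info, 2)                     # rate-1/2 repetition code
perm = rng.permutation(coded.size)             # random interleaver
x = 1.0 - 2.0 * coded[perm]                    # BPSK: bit 0 -> +1, bit 1 -> -1
y = np.convolve(x, h)[:x.size] + rng.normal(0.0, np.sqrt(sigma2), x.size)

def map_equalizer(y, prior_llr):
    """Exact BCJR over the 2-state trellis of a 2-tap channel.
    prior_llr[k] = log P(x_k=+1)/P(x_k=-1); returns extrinsic LLRs."""
    n = y.size
    sym = np.array([1.0, -1.0])                # symbol hypotheses (+1, -1)
    lp = np.stack([prior_llr / 2, -prior_llr / 2])  # log priors up to a constant
    alpha = np.full((n + 1, 2), -np.inf); alpha[0] = np.log(0.5)
    beta = np.full((n + 1, 2), -np.inf); beta[n] = np.log(0.5)
    def g(k, sp, si):                          # log branch metric sp -> si
        mean = h[0] * sym[si] + h[1] * sym[sp]
        return lp[si, k] - (y[k] - mean)**2 / (2 * sigma2)
    for k in range(n):
        for si in range(2):
            alpha[k+1, si] = np.logaddexp(*[alpha[k, sp] + g(k, sp, si) for sp in range(2)])
    for k in range(n - 1, -1, -1):
        for sp in range(2):
            beta[k, sp] = np.logaddexp(*[g(k, sp, si) + beta[k+1, si] for si in range(2)])
    post = np.empty(n)
    for k in range(n):
        num = np.logaddexp(*[alpha[k, sp] + g(k, sp, 0) + beta[k+1, 0] for sp in range(2)])
        den = np.logaddexp(*[alpha[k, sp] + g(k, sp, 1) + beta[k+1, 1] for sp in range(2)])
        post[k] = num - den
    return post - prior_llr                    # strip the prior -> extrinsic

def repetition_decoder(llr):
    """Repeat-2 code: each copy's extrinsic is its sibling's evidence."""
    pairs = llr.reshape(-1, 2)
    return pairs[:, ::-1].reshape(-1), pairs.sum(axis=1)   # extrinsic, posterior

prior = np.zeros(x.size)
for it in range(5):
    eq_ext = map_equalizer(y, prior)                       # on interleaved bits
    dec_in = np.empty_like(eq_ext); dec_in[perm] = eq_ext  # deinterleave
    dec_ext, posterior = repetition_decoder(dec_in)
    prior = dec_ext[perm]                                  # re-interleave as prior
    ber = np.mean((posterior < 0).astype(int) != info)
    print(f"iteration {it + 1}: BER = {ber:.4f}")
```

On such a toy system the BER typically improves over the first few iterations, which is the behaviour the turbo principle is named for; it is also visible here why the MAP equalizer's trellis, and hence its cost, grows exponentially with the channel memory.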
In this thesis the author develops two novel iterative low complexity turbo equalizers. The turbo equalization problem is modeled on superstructures, where, in the context of this thesis, a superstructure performs the task of both the equalizer and the decoder. The resulting low complexity turbo equalizers process all the available information as a whole, so there is no exchange of extrinsic information between different subunits. The first is modeled on a dynamic Bayesian network (DBN), modeling the turbo equalization problem as a quasi-directed acyclic graph by allowing a dominant connection between the observed variables and their corresponding hidden variables, as well as weak connections between the observed variables and past and future hidden variables. The resulting turbo equalizer is named the dynamic Bayesian network turbo equalizer (DBN-TE). The second low complexity turbo equalizer developed in this thesis is modeled on a Hopfield neural network, and is named the Hopfield neural network turbo equalizer (HNN-TE). The HNN-TE is an amalgamation of the HNN maximum likelihood sequence estimation (MLSE) equalizer, developed previously by this author, and an HNN MLSE decoder derived from a single-codeword HNN decoder. Both of the low complexity turbo equalizers developed in this thesis are able to jointly and iteratively equalize and decode coded, randomly interleaved information transmitted through highly dispersive multipath channels.

The performance of both these low complexity turbo equalizers is comparable to that of the conventional turbo equalizer, while their computational complexities are superior for channels with long memory. Their performance is also comparable to that of other low complexity turbo equalizers, but their computational complexities are worse. The computational complexity of both the DBN-TE and the HNN-TE is approximately quadratic at best (and cubic at worst) in the transmitted data block length, exponential in the encoder constraint length, and approximately independent of the channel memory length. The approximately quadratic complexity of both the DBN-TE and the HNN-TE is mostly due to interleaver mitigation, which requires matrix multiplication with matrices whose dimensions equal the data block length, and without which turbo equalization using superstructures is impossible for systems employing random interleavers. Thesis (PhD), University of Pretoria, 2013. Electrical, Electronic and Computer Engineering.
|
970 |
The current state of injury related care for Malawi super league football players. Chapweteka, Isaac. January 2014.
Magister Scientiae (MSc). The study aimed to identify the current state of injury-related care for Malawi Super League football players. To achieve this, the study determined the average time taken by soccer players in Malawi to return to active participation following an injury, identified the types of treatment received by football players, determined how football injuries are managed by team doctors in Malawi, established the responsibilities of football coaches in the management of injuries in Malawi, and established the financial and medical support received by football players after sustaining an injury.
|