About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
511

Quantification of regional cardiac function : clinically-motivated algorithm development and application to cardiac magnetic resonance and computed tomography

Vigneault, Davis Marc January 2017 (has links)
Techniques described to date for the reproducible and noninvasive quantification of regional cardiac function have been largely relegated to research settings due to time-consuming and cumbersome image acquisition and analysis. In this thesis, feature tracking algorithms are developed for 2-D+Time cardiac magnetic resonance (CMR) and 3-D+Time cardiac computed tomography (CCT) image sequences that are easily acquired clinically, while emphasising reproducibility and automation in their design. First, a commercially implemented CMR feature tracking algorithm for the analysis of steady state free precession (SSFP) cine series is evaluated in patients with hypertrophic cardiomyopathy (HCM) and arrhythmogenic right ventricular cardiomyopathy (ARVC), which primarily affect the left ventricle (LV) and right ventricle (RV), respectively, and functional impairment compared with control populations is found in both cases. The limitations of this implementation are then used to guide development of an automated algorithm for the same purpose, making use of fully convolutional neural networks (CNNs) for segmentation and spline registration across all frames simultaneously for tracking. This study is performed in subjects with HCM, and functional impairment is again identified in disease subjects. Finally, as myocardial contraction is inherently a 3-D phenomenon, a technique is developed for quantification of regional function from 3-D+Time functional CCT studies using simultaneous registration of automatically generated Loop subdivision surface models for tracking. This study is performed in canine mongrels and compared with the current state-of-the-art technique for CCT functional analysis. This work demonstrates the feasibility of automated, reproducible cardiac functional analysis from CMR and CCT image sequences. While work remains to be done in extending the principles demonstrated and modular components described to fully automated whole-heart analysis, it is hoped that this thesis will accelerate the clinical adoption of regional functional analysis.
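The regional measures described above are ultimately derived from tracked myocardial contour points. As a minimal illustrative sketch (not the thesis's CNN/spline pipeline), segmental Lagrangian strain can be computed from tracked positions as follows; all coordinates here are hypothetical:

```python
import numpy as np

def segment_length(points):
    """Cumulative arc length of an ordered set of 2-D contour points (mm)."""
    diffs = np.diff(points, axis=0)
    return np.sum(np.linalg.norm(diffs, axis=1))

def lagrangian_strain(points_ed, points_t):
    """Strain of a tracked myocardial segment relative to end-diastole.

    Negative values indicate shortening (contraction).
    """
    l0 = segment_length(points_ed)
    lt = segment_length(points_t)
    return (lt - l0) / l0

# Toy example: a segment that shortens by roughly 15% between frames.
ed = np.array([[0.0, 0.0], [10.0, 0.5], [20.0, 1.0]])   # end-diastolic positions
es = np.array([[0.0, 0.0], [8.5, 0.4], [17.0, 0.9]])    # tracked positions later in systole
print(f"segmental strain: {lagrangian_strain(ed, es):.3f}")
```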
512

Understanding fire histories : the importance of charcoal morphology

Crawford, Alastair James January 2015 (has links)
Quantifying charcoal particles preserved in sedimentary environments is an established method for estimating levels of fire activity in the past, both on human and geological timescales. It has been proposed that the morphology of these particles is also a valuable source of information, for example allowing inferences about the nature of the vegetation burned. This thesis aims to broaden the theoretical basis for these methods, and to integrate morphometric study of sedimentary charcoal with its quantification. Three key questions are addressed: firstly, whether the elongation of mesocharcoal particles is a useful indicator of fuel type; secondly, whether different sedimentary archives tend to preserve different charcoal morphologies; and finally, the critical question of how morphology affects charcoal quantification. The results corroborate the idea that grasses and trees produce mesocharcoal with distinctly different aspect ratios. However, the application of this as an indicator of vegetation change is complicated by the inclusion of species which are neither grasses nor trees, and by considerations of the effects of transportation. Charcoal morphotypes in diverse sedimentary environments are shown to be influenced by vegetation types, transportation history, and nature of the fire that produced them. Previous research has treated charcoal quantification and charcoal morphology as separate issues. Here it is shown that understanding morphology is essential for the accurate quantification of charcoal, since it affects the relationship between volumes and the two-dimensional areas from which measurements are taken. Understanding this relationship could allow such measurements to be used not just as relative measures of past fire activity, but to enable the accurate quantification of the charcoal sequestered in soils and sediments. This has important implications for our ability to understand the effects of fire on carbon cycling, and the role that fire plays in the Earth system.
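As a rough sketch of the morphometric measurement discussed above, particle elongation can be taken as the ratio of the major to minor axis of an ellipse fitted to each particle in a binarized image. The scikit-image calls are standard, but the toy image and the grass/wood cut-off below are purely illustrative:

```python
import numpy as np
from skimage.measure import label, regionprops

def particle_aspect_ratios(binary_image, min_area=20):
    """Aspect ratio (major/minor axis of the fitted ellipse) for each particle."""
    ratios = []
    for region in regionprops(label(binary_image)):
        if region.area < min_area or region.minor_axis_length == 0:
            continue  # skip specks and degenerate fits
        ratios.append(region.major_axis_length / region.minor_axis_length)
    return np.array(ratios)

# Toy binary image: one elongated particle and one blocky particle.
img = np.zeros((60, 60), dtype=bool)
img[5:9, 5:45] = True     # elongated (grass-like) fragment
img[30:45, 30:45] = True  # equant (wood-like) fragment
ratios = particle_aspect_ratios(img)
print(np.round(ratios, 2))
# A purely illustrative cut-off; assemblage-level interpretation would rely on
# the distribution of aspect ratios, not a single threshold per particle.
print(["grass-like" if r > 3.5 else "wood-like" for r in ratios])
```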
513

OCR of dot peen markings : with deep learning and image analysis

Edvartsen, Hannes January 2018 (has links)
A way to follow products through the chain of production is important in the process industry, and this is often solved by marking them with serial numbers. In some cases permanent markings such as dot peen markings are required. To ensure profitability in the industry and reduce errors, these markings must be read automatically. Automatic reading of dot peen markings using a camera can be hard since there is low contrast between the background and the numbers, the background can be uneven, and different illuminations can affect visibility. In this work, two different systems are implemented and evaluated to assess the possibility of developing a robust system. One system uses image analysis to segment the numbers before classifying them. The other system uses recent advances in deep learning for object detection. Both implementations are shown to work in near real-time on a CPU. The deep learning object detection approach was able to classify all numbers in an image correctly 60% of the time, while the other approach succeeded only 20% of the time.
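The comparison reported above (60% versus 20%) is an image-level metric: a reading counts as correct only if every digit in the marking is recognised. A minimal sketch of such a metric, with hypothetical serial numbers and predictions, might look like this:

```python
def image_level_accuracy(predictions, ground_truth):
    """Fraction of images in which every marked digit was read correctly.

    `predictions` and `ground_truth` map an image id to the decoded digit
    string, so a single wrong or missing digit fails the whole image.
    """
    correct = sum(1 for img_id, truth in ground_truth.items()
                  if predictions.get(img_id) == truth)
    return correct / len(ground_truth)

# Hypothetical results for five dot-peen-marked parts.
truth = {"p1": "740219", "p2": "003581", "p3": "992104", "p4": "118836", "p5": "560027"}
cnn_detector = {"p1": "740219", "p2": "003581", "p3": "992104", "p4": "118835", "p5": "560027"}
classic_pipeline = {"p1": "740219", "p2": "083581", "p3": "992104", "p4": "118835", "p5": "568027"}

print(image_level_accuracy(cnn_detector, truth))       # 0.8
print(image_level_accuracy(classic_pipeline, truth))   # 0.4
```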
514

Digital image analysis for tumor cellularity and Gleason grade to tumor volume analysis in prostate cancer

Chaniotakis, Sotiris 11 July 2018 (has links)
PURPOSE: This study was undertaken to compare HALO™ software image analysis measurements of cellularity with visual estimations from the pathologist and to outline a protocol for future experimental determinations of cellularity using HALO™. Secondly, this study investigated the clinically challenging prostate cancers of Gleason score 7 by analyzing a large database of radical prostatectomy (RP) specimens with regard to their Gleason grade composition and percentage tumor volume composition. The importance of these tumor cellularity, prostate volume, and tumor volume data was discussed in terms of future diagnostic endeavors. Finally, this study provided a brief background on prostate cancer, prostate cancer epidemiology, digital pathology, and the limitations and difficulties in the technological transition to digital pathology. All work for this study was done at Dana-Farber Cancer Institute (Boston, MA). METHODS: In the first part of this study, histological slides were acquired by radical prostatectomy and contained 12 tumor foci of varying degrees and sizes. These slides were scanned and imported into the HALO™ image analysis software. The tumor foci, previously demarcated by a pathologist, were annotated by hand in HALO™. An algorithm for image analysis was created by training classifiers to recognize and differentiate between epithelial tissue, stromal tissue, glass, and other. This process was accomplished by classifying 62 regions, which were tested for accuracy before becoming the components of an algorithm to analyze the entire annotation layer. Each tumor focus was analyzed individually, and the results were exported into Microsoft® Excel, from which relevant data were extracted. Cellularity was calculated as the percentage of tumor area that the algorithm characterized as epithelial. Cellularity values derived from HALO™ measurements for each tumor focus were compared with the visual estimations of cellularity provided by the pathologist using Pearson's correlation analysis. In the second part of this study, a database of 1386 slides containing tumors with Gleason scores between 6 and 9 was compiled from 140 RP cases. The average percentages of Gleason grades 3, 4, and 5 in each case were determined. The percentage of each slide that was occupied by the tumor was also averaged for each case, yielding an average percentage of tumor volume for each case. The average Gleason grade 3, 4, or 5 percentage for each case was plotted against the associated average tumor volume percentage of that case. The cases of Gleason score 7 (3+4, 4+3) were then isolated and plotted in a similar manner. Pearson's correlation analysis was used to determine the degree of linear correlation between the two variables in each plot. RESULTS: In the first part of this study, a statistically significant positive correlation between the cellularity estimations of the pathologist and the HALO™ cellularity measurements was found (r = 0.92, p < 0.01, n = 12). In the second part of this study, there was a statistically significant negative correlation between average Gleason grade 3 percentage per case and average tumor volume percentage per case (r = -0.55, p < 0.001, n = 140). There was also a statistically significant positive correlation between average Gleason grade 4 percentage per case and average tumor volume percentage per case (r = 0.55, p < 0.001, n = 140).
After slides containing Gleason score 6 (3+3) tumor were removed from the data, a statistically significant negative correlation remained between average Gleason grade 3 percentage per case and average tumor volume percentage per case (r = -0.51, p < 0.001, n = 78), and a statistically significant positive correlation remained between average Gleason grade 4 percentage per case and average tumor volume percentage per case (r = 0.5, p < 0.001, n = 101). A statistically significant relationship between average Gleason grade 5 percentage and average tumor volume percentage was not found (r = 0.32, p = 0.14, n = 23). CONCLUSIONS: In the first part of this study, the strong positive correlation between HALO™ cellularity values and visual estimations by the pathologist suggests that image analysis may be an effective tool for determining cellularity in digital histological images. More research using larger sample sizes is recommended to further validate the correlation between algorithm-derived cellularity from HALO™ and visual estimation by the pathologist. In the second part of this study, it appears that the volume of prostate tumors of Gleason score 7 may have prognostic power, considering that an increased percentage composition of Gleason grade 4 correlated with larger tumor volumes. Because this result may have significant clinical implications, further research specifically on tumors of Gleason score 7 is suggested to verify this relationship.
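As a sketch of the two calculations described above, hypothetical per-focus areas can be converted into cellularity percentages and correlated against pathologist estimates with a standard Pearson test. The numbers below are illustrative, not the study's data:

```python
import numpy as np
from scipy.stats import pearsonr

def cellularity_percent(epithelial_area, total_tumor_area):
    """Cellularity as the percentage of annotated tumor area classified as epithelium."""
    return 100.0 * epithelial_area / total_tumor_area

# Hypothetical per-focus areas (mm^2) exported from an image-analysis run.
epithelial = np.array([1.2, 0.8, 2.5, 0.4, 1.9, 0.7])
tumor_total = np.array([2.0, 1.6, 3.1, 1.0, 2.4, 1.5])
algorithm_cellularity = cellularity_percent(epithelial, tumor_total)

# Hypothetical visual estimates for the same foci (%).
pathologist_cellularity = np.array([65, 45, 85, 35, 75, 50])

r, p = pearsonr(algorithm_cellularity, pathologist_cellularity)
print(f"r = {r:.2f}, p = {p:.3g}")
```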
515

Multiscale analysis of cohesive fluidization

Umoh, Utibe Godwin January 2018 (has links)
Fluidization of a granular assembly of solid particles is a process where particles are suspended in a fluid by the upward flow of fluid through the bed. This process is important in industry as it has a wide range of applications due to the high mixing and mass transfer rates present as a result of the rapid movement of particles which occurs in the bed. The dynamics of fluidization is heavily dependent on the particle scale physics and the forces acting at a particle level. For particles with sizes and densities less than 100 μm and 10³ kg/m³, the importance of interparticle forces such as cohesion to the fluidization phenomena observed increases compared with larger particles, where the phenomena observed are more dependent on hydrodynamic forces. These smaller particles are increasingly in high demand in industrial processes due to the increasing surface area per unit volume obtained by decreasing the particle size. Decreasing particle size, however, leads to an increase in the impact of the cohesive interparticle forces present between particles, thus altering fluidization phenomena. It is thus necessary to gain a greater understanding of how these cohesive forces alter fluidization behaviour at both the particle and the bulk scale. This work begins with an experimental study of a fluidized bed using high speed imaging. The applicability of particle image velocimetry to a dense bed is examined, with verification and validation studies showing that particle image velocimetry is able to accurately capture averaged velocity profiles for particles at the front wall. A digital image analysis algorithm capable of accurately extracting particle solid fraction data for a dense bed under non-optimum lighting conditions was also developed. Together, both experimental techniques were used to extract averaged particle mass flux data capable of accurately capturing and probing fluidization phenomena for a dense fluidized bed. The simulation studies carried out for this work examine the impact of cohesive forces, introduced using a van der Waals cohesion model, on phenomena observed at different length scales using DEM-CFD simulations. Numerical simulations were run for Geldart Group A sized particles at different cohesion levels, represented by the Bond number, and at different inlet gas velocities encompassing the different fluidization regimes present. A stress analysis was used to examine the mechanical state of the expanded bed at different cohesion levels, with the vertical component of the total stress showing negative (tensile) stresses at the center of the bed. Further analysis of the contact and cohesive components of the stress, together with a k-core and microstructural analysis focusing on the solid fraction and coordination number profiles, indicated that this negative total stress was caused by a decrease in the contact stress due to breakage of mechanical contacts as cohesive forces are introduced and increased. A pressure overshoot analysis was also conducted, with the magnitude of the overshoot in pressure seen during the pressure drop analysis of a cohesive bed shown to be of equivalent magnitude to the gradient of the negative total stress profile. The in-homogeneous nature of the bed was probed, with the focus on how increasing cohesion levels increase the degree of inhomogeneity present in the expanded bed and how local mesoscopic structures change with cohesion and gas velocity.
It was shown that increasing cohesion increases the degree of inhomogeneity in the bed as well as the degree of clustering between particles. A majority of particles were shown to be present in a single macroscopic cluster in the mechanical network, with distinct local mesoscopic structures forming within the macroscopic cluster. The cohesive bed also expanded as distinct dense regions separated by zones of low mechanical contact. A macroscopic cluster analysis showed that the majority of particles are in strong, enduring mechanical and cohesive contact. Increasing cohesive forces were also shown not only to create a cohesive support network around the mechanical network but also to strengthen the mechanical contact network. The significance of the strong and weak mechanical and cohesive forces for fluidization phenomena was also examined, with analysis showing that the weak mechanical forces act to support the strong mechanical forces. The cohesive force network, however, was non-coherent, with strong forces significantly greater than weak forces. Fluidization phenomena were shown to be driven by the magnitude of the strong cohesive forces set by the minimum particle cutoff distance. This also called into question the significance of the cohesive coordination number, which is dependent on the maximum cohesive cutoff. The value of the maximum cutoff was shown to be less significant, as no significant changes were observed in the stress and microstructure data as the maximum cutoff was altered. Simulations with different ratios of cohesive and non-cohesive particles were also undertaken and showed that a disruption in the cohesive force network leads to changes in the stress state and microstructure of the bed, thus changing the fluidization phenomena observed at all length scales. The nature of the strong cohesive force network thus drives the fluidization phenomena seen in the bed.
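The cohesion levels referred to in this abstract are characterised by the Bond number, the ratio of the maximum cohesive force to the particle weight. A minimal sketch under one common van der Waals closure follows; the force law, Hamaker constant and cut-off distance are assumptions for illustration, not the thesis's exact model:

```python
import numpy as np

def vdw_bond_number(d, rho_p, hamaker=6.5e-20, z_min=1.0e-9, g=9.81):
    """Granular Bond number: max van der Waals attraction over particle weight.

    Uses F_vdw = A*d / (24*z_min**2) for two equal spheres of diameter d at the
    minimum separation z_min -- one common closure in DEM cohesion models; the
    Hamaker constant and cut-off distance here are illustrative values.
    """
    f_cohesive = hamaker * d / (24.0 * z_min**2)
    weight = rho_p * (np.pi / 6.0) * d**3 * g
    return f_cohesive / weight

# Geldart Group A style particles: cohesion dominates as size decreases.
for d in (30e-6, 60e-6, 100e-6):
    print(f"d = {d*1e6:5.0f} um  Bo = {vdw_bond_number(d, rho_p=1500):8.1f}")
```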
516

Micro-particle Streak Velocimetry - Theory, Simulation Methods and Applications.

January 2011 (has links)
This dissertation describes a novel, low-cost strategy of using particle streak (track) images for accurate micro-channel velocity field mapping. It is shown that 2-dimensional, 2-component fields can be efficiently obtained using the spatial variation of particle track lengths in micro-channels. The velocity field is a critical performance feature of many microfluidic devices. Since it is often the case that un-modeled micro-scale physics frustrates principled design methodologies, particle-based velocity field estimation is an essential design and validation tool. Current technologies that achieve this goal use particle constellation correlation strategies and rely heavily on costly, high-speed imaging hardware. The proposed image/video processing based method achieves comparable accuracy for a fraction of the cost. In the context of micro-channel velocimetry, the usability of particle streaks has been poorly studied so far. Their use has remained restricted mostly to bulk flow measurements and occasional ad-hoc uses in microfluidics. A second look at the usability of particle streak lengths in this work reveals that they can be used efficiently, approximately 15 years after their first use for micro-channel velocimetry. Particle tracks in steady, smooth microfluidic flows are mathematically modeled, and a framework for using experimentally observed particle track lengths for local velocity field estimation is introduced here, followed by algorithm implementation and quantitative verification. Further, experimental considerations and image processing techniques that can facilitate the proposed methods are also discussed in this dissertation. The unavailability of benchmarked particle track image data motivated the implementation of a simulation framework with the capability to generate exposure-time-controlled particle track image sequences for velocity vector fields. This dissertation also describes this work and shows that arbitrary velocity fields designed in computational fluid dynamics software tools can be used to obtain such images. Apart from aiding gold-standard data generation, such images would find use for quick microfluidic flow field visualization and help improve device designs. / Dissertation/Thesis / Ph.D. Electrical Engineering 2011
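The central relation behind streak-based velocimetry is that a particle of image diameter d moving at speed v during an exposure Δt leaves a streak of length roughly v·Δt + d, so the local speed follows from the measured track length. A minimal sketch of this inversion, with illustrative calibration values not taken from the dissertation, is:

```python
import numpy as np

def speed_from_streak(streak_length_px, particle_diameter_px, pixel_size_um, exposure_s):
    """Local particle speed (um/s) from an observed streak length.

    Assumes the streak spans v*dt plus one particle image diameter, i.e.
    L = v*dt + d, so v = (L - d) / dt -- the basic relation behind
    streak-based velocimetry; calibration constants here are illustrative.
    """
    length_um = streak_length_px * pixel_size_um
    diameter_um = particle_diameter_px * pixel_size_um
    return (length_um - diameter_um) / exposure_s

# Hypothetical measurements at three positions across a micro-channel.
streaks_px = np.array([12.0, 28.0, 13.5])      # measured track lengths
speeds = speed_from_streak(streaks_px, particle_diameter_px=5.0,
                           pixel_size_um=0.65, exposure_s=0.02)
print(np.round(speeds, 1))  # slower near the walls, fastest at the centre
```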
517

Desenvolvimento de plântulas de soja em função da dessecação das plantas e do tratamento das sementes / Soybean seedling development as affected by plant desiccation and seed treatment

Toledo, Mariana Zampar, 1982- January 2011 (has links)
Advisor: Cláudio Cavariani / Committee member: Edvaldo Aparecido Amaral da Silva / Committee member: Maria Laene Moreira de Carvalho / Committee member: José de Barros França Neto / Committee member: Ana Dionisia da Luz Coelho Novembre / Abstract: This work had the objective of evaluating soybean germination and seedling development as affected by chemical treatment of seeds of different vigor levels with fungicides, insecticides, micronutrients and a biostimulant, produced with and without pre-harvest plant desiccation with glyphosate. The first stage of this research was the production of soybean seeds, cultivar Conquista, with different levels of initial vigor, both with and without plant desiccation, in a randomized block design with four replications arranged as a split-split plot. Main plots consisted of two sowing times (November and December of 2008), and subplots of plants desiccated or not at pre-harvest with glyphosate at a dose of 2.0 L ha-1 at stage R7. Seeds of each subplot were harvested at three times, i.e., 1, 7 and 14 days after physiological maturity, constituting the sub-subplots. Afterwards, seed quality was evaluated by germination and vigor tests to determine vigor levels. Then, seeds were treated with fungicides, insecticides, micronutrients and a biostimulant to evaluate germination and seedling development. Finally, the characteristics of seedlings produced by the stored seeds, treated and non-treated, were evaluated by an image analysis system (Seed Vigor Imaging System - SVIS®), at The Ohio State University, Department of Horticulture and Crop Science, Columbus, Ohio, U.S.A. For this statistical analysis, the experimental design was completely randomized, with four replications.
Seed quality data evaluated right after harvest were submitted to analysis of variance and means were compared by the Tukey test (p ≤ 0.05), separately for each desiccation treatment. After chemical treatment, means were compared by the Tukey test (p ≤ 0.05) and analyzed as a 2 x 6 x 8 factorial (with and without pre-harvest desiccation x vigor levels x product) for fungicides, 2 x 6 x 2 for insecticides and 2 x 6 x 3 for micronutrients and the biostimulant. The Dunnett test (p ≤ 0.05) was applied to compare each mean value with the control, without ... (Complete abstract click electronic access below) / Doctorate
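For readers unfamiliar with the statistical workflow mentioned above (analysis of variance followed by Tukey comparisons at p ≤ 0.05), a minimal sketch using statsmodels and entirely hypothetical germination data might look like this:

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical germination percentages for seed lots given three seed treatments.
rng = np.random.default_rng(1)
treatments = np.repeat(["control", "fungicide", "biostimulant"], 4)
germination = np.concatenate([
    rng.normal(82, 3, 4),   # control
    rng.normal(90, 3, 4),   # fungicide-treated
    rng.normal(85, 3, 4),   # biostimulant-treated
])

# Pairwise Tukey HSD comparison of treatment means at alpha = 0.05,
# analogous to the p <= 0.05 Tukey comparisons described in the abstract.
result = pairwise_tukeyhsd(endog=germination, groups=treatments, alpha=0.05)
print(result)
```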
518

Transferts d'apparence en espace image basés sur des propriétés texturelles / Image-space appearance transfers based on textural properties

Arbelot, Benoit 07 April 2017 (has links)
Image-space appearance manipulation techniques are widely used in various domains such as photography, biology, astronomy or the performing arts. An image's appearance depends on its colors and textures, but also on perceived 3D information such as shapes, materials and illumination. These characteristics also create a specific look and feel for the image, which is likewise part of its appearance. The goal of image-space manipulation techniques is to modify colors and textures as a means to alter perceived shapes, illumination and materials, and from this possibly alter the image's look and feel. Appearance transfer methods are a specific class of manipulation techniques aiming to make the process more intuitive by automatically computing the image modification. In order to do so, they use an additional user-provided image depicting the desired appearance. In this thesis, we study image-space appearance transfer based on textural properties. Since textures are an integral part of image appearance, guiding appearance transfers with textural information is an attractive approach. We first present a texture descriptor filtering framework to better preserve image edges and texture transitions in the texture analysis. We then use this framework, coupled with different texture descriptors, to apply local, texture-guided color transfer, colorization and texture transfer.
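The transfers in this thesis are guided locally by texture descriptors. As a simpler point of reference only, a global statistical color transfer (Reinhard-style matching of per-channel mean and standard deviation in Lab space) can be sketched as follows; this baseline is not the author's method:

```python
import numpy as np
from skimage import color

def reinhard_color_transfer(source_rgb, target_rgb):
    """Global color transfer: match per-channel mean/std of `source` to `target` in Lab.

    A standard baseline (Reinhard et al. style); the thesis goes further by
    guiding the transfer locally with texture descriptors.
    """
    src = color.rgb2lab(source_rgb)
    tgt = color.rgb2lab(target_rgb)
    out = np.empty_like(src)
    for c in range(3):
        s_mean, s_std = src[..., c].mean(), src[..., c].std() + 1e-8
        t_mean, t_std = tgt[..., c].mean(), tgt[..., c].std()
        out[..., c] = (src[..., c] - s_mean) * (t_std / s_std) + t_mean
    return np.clip(color.lab2rgb(out), 0.0, 1.0)

# Toy usage with random float images in [0, 1]; real inputs would be photographs.
rng = np.random.default_rng(0)
result = reinhard_color_transfer(rng.random((64, 64, 3)), rng.random((64, 64, 3)))
print(result.shape, result.dtype)
```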
519

Analyse d'images 3D par méthodes variationnelles et ondelettes : application à l'imagerie médicale / 3D image analysis with variational methods and wavelets : applications to medical image processing

Tran, Minh-Phuong 28 September 2012 (has links)
Medical procedures have become a critical application area that makes substantial use of image processing. Medical image processing tasks mainly deal with image restoration and image segmentation, which bring out image details and allow medical conditions to be measured quantitatively. The diagnosis of a health problem is now highly dependent on the quality and the credibility of the image analysis. The practical contributions of this thesis can be considered in many directions for the medical domain. This manuscript addresses 3D image analysis with variational methods and the wavelet transform in the context of medical image processing. We first survey the second-order variational minimization model, which has been shown to perform better than the classical first-order Rudin-Osher-Fatemi model. This method is applied to image denoising and segmentation problems, after a short state of the art of medical image acquisition techniques. Then we introduce the concept of the wavelet transform and present some algorithms that are also used in this domain. Experimental results show that these tools are very useful and competitive. The core of this research is the development of new 3D representations that are well adapted to complicated medical data, such as under-sampled, low-contrast MRI of the mouse cerebellum and filamentary structures in 3D angiography volumes of the mouse brain vessel network. Each of these two families of methods has advantages and disadvantages, so in the rest of the thesis we propose a new model that combines both schemes. With the new decomposition model, noise is removed successfully from the reconstructed image while contours and textures are well preserved, leading to further improvements in denoising performance.
Finally, the last part of the thesis is devoted to extending some classical contour closing methods, namely hysteresis thresholding and contour closing based on the chamfer distance transform, to the 3D context. The thesis concludes with a review of our main results and a discussion of a few of the many open problems and promising directions for further research and application.
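As a point of reference for the two families of methods combined in this thesis, off-the-shelf first-order total-variation denoising and wavelet shrinkage can be applied to a synthetic 3D volume as below. This sketch uses standard scikit-image routines and is not the author's second-order or mixed model:

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle, denoise_wavelet

# Synthetic noisy 3-D volume standing in for an MRI block.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32, 32))
clean[8:24, 8:24, 8:24] = 1.0  # a bright cube as "structure"
noisy = np.clip(clean + rng.normal(0.0, 0.2, clean.shape), 0.0, 1.0)

# First-order total-variation (ROF-type) denoising; the thesis studies a
# second-order variant, for which no off-the-shelf call is assumed here.
tv = denoise_tv_chambolle(noisy, weight=0.15)

# Wavelet shrinkage denoising, the second family of methods discussed.
wav = denoise_wavelet(noisy, sigma=0.2, mode="soft", rescale_sigma=True)

for name, img in [("noisy", noisy), ("TV", tv), ("wavelet", wav)]:
    rmse = np.sqrt(np.mean((img - clean) ** 2))
    print(f"{name:7s} RMSE = {rmse:.3f}")
```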
520

Roles of the two chemotaxis clusters in Rhodobacter sphaeroides

de Beyer, Jennifer Anne January 2013 (has links)
Bacteria swim towards improving conditions by controlling flagellar activity via signals (CheY) sent from chemosensory protein clusters, which respond to changing stimuli. The best studied chemotactic bacterium, E. coli, has one transmembrane chemosensory protein cluster controlling flagellar behaviour. R. sphaeroides has two clusters, one transmembrane and one cytoplasmic. The roles of the two clusters in regulating swimming and chemosensory behaviour are explored here. Newly-developed software was used to measure the effect of deleting or mutating each chemotaxis protein on unstimulated swimming and on the chemosensory response to dynamic change. New behaviours were identified by using much larger sample sizes than previous studies. R. sphaeroides chemotaxis mutants were classified as (i) stoppy unresponsive; (ii) smooth unresponsive or (iii) stoppy inhibited compared to wildtype swimming and chemosensory behaviour. The data showed that the ability to stop during free-swimming is not necessarily connected to the ability to respond to a chemotaxis challenge. The data suggested a new model of connectivity between the two chemosensory pathways. CheY3 and CheY4 are phosphorylated by the transmembrane polar cluster in response to external chemoeffector concentrations. CheY6-P produced by the cytoplasmic cluster is a requirement for chemotaxis, whether or not the polar cluster is able to produce CheY6-P. CheY6-P stops the motor, whereas CheY3,4-P allow smooth swimming. When chemoeffector levels fall, the signals through CheY3,4 fall, allowing CheY6-P to bind and stop the motor. As the polar cluster adapts to the fall by the action of the adaptation proteins CheB1 and CheR2, the concentration of CheY3,4-P increases again, to compete with CheY6-P and allow periods of smooth swimming. Under aerobic conditions, the cytoplasmic cluster controls the basal stopping frequency and does not appear to respond to external chemoeffector changes. The role of the adaptation proteins in resetting the signalling state in R. sphaeroides is unclear, particularly the roles of the proteins associated with the cytoplasmic cluster, CheB2 and CheR3. Tandem mass spectrometry was used to identify glutamate and glutamine (EQ) sites on the cytoplasmic R. sphaeroides chemoreceptor TlpT that are deamidated and methylated by the R. sphaeroides adaptation homologues. In E. coli, adaptation sites are usually EQ/EQ pairs. However, the sites reported in TlpT vary at the first residue in the pair. Mutation of the putative EQ adaptation sites caused changes in adaptation, suggesting that CheY6-P levels are controlled and reset by CheB2 and CheR3 controlling the adaptation state of TlpT.
