1 |
Effect of pressure on porous materials. McMonagle, Charles James, January 2018.
Research to design and synthesise new porous materials is a rapidly growing field, with thousands of new systems proposed every year owing to their potential use in a multitude of applications across a wide range of fields. Pressure is a powerful tool for characterising structure-property relationships in these materials, and understanding those relationships is key to unlocking their full potential. In this thesis we investigate a variety of porous materials across a range of pressures. Over time the chemical architecture and complexity of porous materials has increased. Although some systems display remarkable stability to high pressures, which we generally take to mean above 1 GPa (10,000 bar), the compressibility of porous materials has in general increased substantially over the last 10 years, rendering most of them unstable at GPa pressures. Here we present new methods for investigating porous materials at much more moderate pressures (hundreds of bar), alongside more traditional high-pressure methods (diamond anvil cell techniques), finishing with gas sorption studies in a molecular-based porous material. The design and development of a new moderate-pressure sapphire capillary cell for the small-molecule beamline I19 at the Diamond Light Source is described. This cell routinely reached pressures of more than 1000 bar, with a maximum operating pressure of 1500 bar and very precise pressure control (< 10 bar) on both increasing and decreasing pressure. It closes the gap between ambient pressure and the lowest pressures attainable in a diamond anvil cell (DAC), which are generally above 0.2 GPa (2000 bar). Alongside the development of the sapphire capillary pressure cell, the compression to 1000 bar of the small organic molecule hexamethylenetetramine (hexamine, C6H12N4) and its deuterated form (C6D12N4) was determined, demonstrating the precision possible with this cell. Solvent uptake into porous materials can induce large structural changes at hundreds of bar. In the case of the Sc-based metal-organic framework (MOF) Sc2BDC3 (BDC = 1,4-benzenedicarboxylate), we used the sapphire capillary pressure cell to study changes in the framework structure on the uptake of n-pentane and isopentane. This work shows how the shape and smaller size of n-pentane facilitated swelling of the framework, which could explain the increased stability of the MOF under applied pressure. The effect of pressure on the previously unreported Cu-framework bis[1-(4-pyridyl)butane-1,3-dione]copper(II) (CuPyr-I) was investigated using high-pressure single-crystal diffraction techniques (DAC). CuPyr-I was found to exhibit high-pressure and low-temperature phase transitions, a pressure-induced Jahn-Teller switch (which was hydrostatic-medium dependent), piezochromism, and negative linear compressibility. Although each of these phenomena has been reported numerous times across a range of materials, this is, to the best of our knowledge, the first time they have all been observed in the same material. The final two chapters investigate the exceptional thermal, chemical, and mechanical stability of a porous molecular crystal (PMC) system prepared by the co-crystallisation of a cobalt phthalocyanine derivative and a fullerene (C60 or C70). The stabilising fullerene is captured in the cavity between two phthalocyanines in a ball-and-socket arrangement.
These PMCs retain their porous structure: on the evacuation of the solvent of crystallisation; on heating to over 500 K; on prolonged immersion in boiling aqueous acid, base, and water; and at extreme pressures of up to 5.85 GPa, the first reported high-pressure study of a PMC. The reactive cobalt cation is accessible via the massive interconnected voids (8 nm3), as demonstrated by the adsorption and binding of CO and O2 to the empty metal site using in situ crystallographic methods available at beamline I19, Diamond Light Source.
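As context for how pressure-volume data from experiments like these are commonly analysed (the thesis itself does not publish analysis code), the sketch below fits a third-order Birch-Murnaghan equation of state to hypothetical unit-cell volumes; every number in it is a placeholder, not a result from the work.

```python
# Illustrative sketch only: fitting a third-order Birch-Murnaghan equation of
# state to hypothetical pressure-volume data, as is routinely done with
# high-pressure diffraction results. All values are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, V0, B0, B0_prime):
    """Pressure (GPa) as a function of unit-cell volume (A^3)."""
    eta = (V0 / V) ** (1.0 / 3.0)
    return 1.5 * B0 * (eta**7 - eta**5) * (1.0 + 0.75 * (B0_prime - 4.0) * (eta**2 - 1.0))

# Hypothetical observations: unit-cell volumes measured at several pressures.
pressures = np.array([0.0, 0.2, 0.5, 1.0, 2.0, 3.0])                    # GPa
volumes   = np.array([2300.0, 2280.0, 2255.0, 2215.0, 2150.0, 2100.0])  # A^3

# Fit P(V); initial guesses: ambient volume, a soft bulk modulus, B0' ~ 4.
popt, pcov = curve_fit(birch_murnaghan, volumes, pressures, p0=(2300.0, 10.0, 4.0))
V0, B0, B0_prime = popt
print(f"V0 = {V0:.1f} A^3, B0 = {B0:.2f} GPa, B0' = {B0_prime:.2f}")
```

The fitted bulk modulus B0 is what lets soft porous frameworks be compared quantitatively with conventionally "hard" crystals.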
|
2 |
The metabolomics of chronic stress. Sobsey, Constance Ananta, 26 April 2016.
The World Health Organization has called stress-related illness “the health epidemic of the 21st century.” While the biochemical pathways associated with the acute stress response are well-characterized, many of the pathways behave differently under conditions of chronic stress. The purpose of this project is to apply high-sensitivity mass spectrometry (MS)-based targeted and untargeted metabolomics approaches to generate new insights into the biochemical processes and pathways associated with the chronic stress response, and potential mechanisms by which chronic stress produces adverse health effects.
Chapter 1 describes the application of sets of targeted and untargeted metabolomics approaches to analyze serum samples from a human epigenetic model of chronic stress in order to identify potential targets for further analysis. To test the resulting hypothesis that oxidative stress is a key feature of chronic stress, a new targeted multiple reaction monitoring (MRM)-MS assay was developed for the accurate quantitation of aldehyde products of lipid peroxidation, as described in Chapter 2. In Chapter 3, the validated method for quantitation of malondialdehyde (MDA) was applied to mouse plasma samples from a model of chronic social defeat stress to determine whether animals exposed to psychosocial stress show increases in oxidative stress. Mouse plasma samples from this model were also analyzed by untargeted metabolomics using Fourier-transform (FT)-MS to identify other important metabolite features, particularly those that overlap with metabolites identified in the human epigenetic model.
Analysis of metabolomic data from two very different models of chronic stress supports the consistent detection of a metabolomic phenotype for chronic stress that is characterized by the dysregulation of energy metabolism associated with decreased concentrations of diacyl-phospholipids in blood. Increased blood concentrations of fatty acids, carnitines, acylcarnitines, and ether phospholipids were also observed. In addition to metabolites associated with energy metabolism, chronic stress also significantly influenced metabolites associated with amino acid metabolism and cell death. This characteristic pattern of differences in metabolite concentrations was observed in the plasma of mice exposed to chronic social defeat stress, irrespective of whether or not they displayed outward signs of a chronic stress response. In fact, mice that were “resilient” to the behavioural effects of chronic social defeat stress displayed an exaggerated phenotype compared with mice that showed depressive-like symptoms following chronic stress exposure. This may suggest that the observed changes in fatty acid composition are protective against stress. However, changes in fatty acid composition are also known to be associated with a wide variety of pathologies including heart disease, neurodegenerative diseases, and mood disorders, so the lipidomic changes associated with chronic stress may also contribute to its health impact. Overall, the results provide further evidence that changes in energy metabolism are a central part of allostatic adaptation to chronic stress.
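To illustrate how a targeted MRM-MS assay such as the MDA method is typically quantified (a generic sketch, not the thesis's actual calibration procedure), the snippet below back-calculates concentrations from a linear calibration curve built on analyte/internal-standard peak-area ratios; all names and numbers are hypothetical.

```python
# Illustrative sketch: back-calculating analyte concentrations from an
# MRM-MS calibration curve with an internal standard. All values are
# hypothetical placeholders, not data from the thesis.
import numpy as np

# Calibration standards: known MDA concentrations (uM) and observed
# analyte/internal-standard peak-area ratios.
cal_conc  = np.array([0.1, 0.5, 1.0, 2.5, 5.0, 10.0])
cal_ratio = np.array([0.021, 0.098, 0.205, 0.49, 1.02, 2.01])

# Ordinary least-squares line through the calibration points.
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

def quantify(peak_area_analyte, peak_area_istd):
    """Return the back-calculated concentration (uM) for one sample."""
    ratio = peak_area_analyte / peak_area_istd
    return (ratio - intercept) / slope

print(quantify(peak_area_analyte=15400.0, peak_area_istd=76000.0))
```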
|
3 |
Inorganic polyphosphate in the marine environment: field observations and new analytical techniques. Diaz, Julia M., 31 March 2011.
Phosphorus (P) is a requirement for biological growth, but this vital nutrient is present at low or limiting concentrations across vast areas of the global surface ocean. Inorganic polyphosphate (poly-P), a linear polymer of at least three orthophosphate units, is one component of the marine P cycle that has been relatively overlooked as compared to other P species, owing in part to a lack of routine analytical techniques that cleanly evaluate it within samples. This thesis demonstrates that inorganic poly-P is a quantitatively significant and dynamic component of the global marine P cycle while also establishing two new techniques for its analysis in biological and environmental samples. In Chapter 2, experiments using the freshwater algae Chlamydomonas sp. and Chlorella sp. illustrate X-ray fluorescence spectromicroscopy as a powerful tool for the sub-micron scale assessment of poly-P composition in organisms. This method enabled the discovery, detailed in Chapter 3, of a mechanism for the long-term sequestration of the vital nutrient P from marine systems via the initial formation of poly-P in surface waters and its eventual transformation into the mineral apatite within sediments. The importance of marine poly-P is furthermore established in Chapter 3 by observations showing that naturally-occurring poly-P represents 7-11% of total P in particles and dissolved matter in Effingham Inlet, a eutrophic fjord located on Vancouver Island, British Columbia. In Chapter 4, a new fluorometric protocol based on the interaction of inorganic poly-P with 4',6-diamidino-2-phenylindole (DAPI) is established as a technique for the direct quantification of poly-P in environmental samples. Chapter 5 presents work from Effingham Inlet utilizing this method, which shows that inorganic poly-P plays a significant role in the redox-sensitive cycling of P in natural systems.
|
4 |
Implementace podnikového informačního systému: teorie a praxe / Implementation of business information system: Theory and Practice. Jonáš, Pavel, January 2010.
The work aims to familiarize readers with information systems in contemporary business and to show the greatest weaknesses in the implementation of information systems in enterprises. The issue of implementation is addressed from all sides and angles. First, the implementation process itself and its parts are analyzed. General methods applicable to software development follow, together with an evaluation of their use in the implementation process. Part of the work is also devoted to management, discussing how the views of the company's management can influence the implementation process. Another problem addressed is the set of key factors influencing the course and success of implementation. Alongside these factors, the most important risks affecting the various stages of implementation are described. Based on these risks, three specific scenarios are constructed that could arise in individual emergency cases. Readers are gradually drawn into the issues that arise during an actual implementation in real practice. The individual parts highlight the most significant weaknesses, or places that none of the parties should underestimate. In addition to the weaknesses of the implementation process itself, two basic case studies spanning the entire spectrum of companies are discussed. At the conclusion, the implementation process is displayed in a graphic sketch in which the individual problems are mapped onto the implementation process itself, so the reader can better picture the process, including the sequence of steps. The complete implementation issue is addressed from the viewpoints of all interested parties, so both a company owner thinking about an implementation and a project manager dealing with the issue will find it useful. The work is based on the author's years of experience with information system implementations.
|
5 |
Reconstruction and Local Recovery of Data from Synchronization Errors. Minshen Zhu (15334783), 21 April 2023.
In this thesis we study the complexity of data recovery from synchronization errors, namely insertion and deletion (insdel) errors.

Insdel Locally Decodable Codes (Insdel LDCs) are error-correcting codes that admit super-efficient decoding algorithms even in the presence of many insdel errors. The study of such codes for Hamming errors has spanned several decades, whereas work on the insdel analogue had amounted to only a few papers before our work. This work initiates a systematic study of insdel LDCs, seeking to bridge this gap through designing codes and proving limitations. Our upper bounds essentially match those for Hamming LDCs in important ranges of parameters, even though insdel LDCs are more general than Hamming LDCs. Our main results are lower bounds that are exponentially stronger than the ones inherited from the Hamming LDCs. These results also have implications for the well-studied variant of relaxed LDCs. For this variant, besides showing the first results in the insdel setting, we also answer an open question for the Hamming variant by showing a strong lower bound.

In the trace reconstruction problem, the goal is to recover an unknown source string x ∈ {0,1}^n from random traces, which are obtained by hitting the source string with random deletions/insertions at a fixed rate. Mean-based algorithms are a class of reconstruction algorithms whose outputs depend only on the empirical estimates of individual bits. The number of traces needed for mean-based trace reconstruction has already been settled. We further study the performance of mean-based algorithms in a scenario where one wants to distinguish between two source strings parameterized by their edit distance, and we also provide explicit constructions of strings that are hard to distinguish. We further establish an equivalence to the Prouhet-Tarry-Escott problem from number theory, which ends up being an obstacle to constructing explicit hard instances against mean-based algorithms.
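For readers unfamiliar with mean-based algorithms, the sketch below simulates a deletion-only channel and computes the per-position empirical bit averages (the "mean trace") that such algorithms are restricted to. It is an illustration of the setting, not code from the thesis; the deletion rate, string length, and trace count are arbitrary.

```python
# Illustrative sketch: the "mean trace" statistic used by mean-based trace
# reconstruction algorithms. Each trace is the source string passed through
# an i.i.d. deletion channel; a mean-based algorithm only sees per-position
# empirical bit averages. Parameters below are arbitrary placeholders.
import numpy as np

rng = np.random.default_rng(0)

def deletion_trace(x, q):
    """One trace: delete each bit of x independently with probability q."""
    keep = rng.random(len(x)) >= q
    return x[keep]

def mean_trace(x, q, num_traces):
    """Average the traces position-by-position (zero-padded to len(x))."""
    acc = np.zeros(len(x))
    for _ in range(num_traces):
        t = deletion_trace(x, q)
        acc[:len(t)] += t
    return acc / num_traces

x = rng.integers(0, 2, size=64)      # unknown source string (here: random)
m = mean_trace(x, q=0.1, num_traces=10000)

# A mean-based algorithm would base all decisions on statistics like m,
# e.g. distinguishing two candidate source strings by comparing their
# expected mean traces.
print(np.round(m[:10], 3))
```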
|
6 |
Flexible and Data-Driven Modeling of 3D Protein Complex Structures. Charles W Christoffer (17482395), 30 November 2023.
<p dir="ltr">Proteins and their interactions with each other, with nucleic acids, and with other molecules are foundational to all known forms of life. The three-dimensional structures of these interactions are an essential component of a comprehensive understanding of how they function. Molecular-biological hypothesis formulation and rational drug design are both often predicated on a particular structure model of the molecule or complex of interest. While experimental methods capable of determining atomic-detail structures of molecules and complexes exist, such as the popular X-ray crystallography and cryo-electron microscopy, these methods require both laborious sample preparation and expensive instruments with limited throughput. Computational methods of predicting complex structures are therefore desirable if they can enable cheap, high-throughput virtual screening of the space of biological hypotheses. Many common biomolecular contexts have largely been blind spots for predictive modeling of complex structures. In this direction, docking methods are proposed to address extreme conformational change, nonuniform environments, and distance-geometric priors. Flex-LZerD deforms a flexible protein using a novel fitting procedure based on iterated normal mode decomposition and was shown to construct accurate complex models even when an initial input subunit structure exhibits extreme conformational differences from its bound state. Mem-LZerD efficiently constrains the docking search space by augmenting the geometric hashing data structure at the core of the LZerD algorithm and enabled membrane protein complexes to be efficiently and accurately modeled. Finally, atomic distance-based approaches developed during modeling competitions and collaborations with wet lab biologists were shown to effectively integrate domain knowledge into complex modeling pipelines.</p>
|
7 |
HIGHLY ACCURATE MACROMOLECULAR STRUCTURE COMPLEX DETECTION, DETERMINATION AND EVALUATION BY DEEP LEARNING. Xiao Wang (17405185), 17 November 2023.
<p dir="ltr">In life sciences, the determination of macromolecular structures and their functions, particularly proteins and protein complexes, is of paramount importance, as these molecules play critical roles within cells. The specific physical interactions of macromolecules govern molecular and cellular functions, making the 3D structure elucidation of these entities essential for comprehending the mechanisms underlying life processes, diseases, and drug discovery. Cryo-electron microscopy (cryo-EM) has emerged as a promising experimental technique for obtaining 3D macromolecular structures. In the course of my research, I proposed CryoREAD, an innovative AI-based method for <i>de nov</i>o DNA/RNA structure modeling. This novel approach represents the first fully automated solution for DNA/RNA structure modeling from cryo-EM maps at near-atomic resolution. However, as the resolution decreases, structure modeling becomes significantly more challenging. To address this challenge, I introduced Emap2sec+, a 3D deep convolutional neural network designed to identify protein secondary structures, RNA, and DNA information from cryo-EM maps at intermediate resolutions ranging from 5-10 Å. Additionally, I presented Alpha-EM-Multimer, a groundbreaking method for automatically building full protein complexes from cryo-EM maps at intermediate resolution. Alpha-EM-Multimer employs a diffusion model to trace the protein backbone and subsequently fits the AlphaFold predicted single-chain structure to construct the complete protein complex. Notably, this method stands as the first to enable the modeling of protein complexes with more than 10,000 residues for cryo-EM maps at intermediate resolution, achieving an average TM-Score of predicted protein complexes above 0.8, which closely approximates the native structure. Furthermore, I addressed the recognition of local structural errors in predicted and experimental protein structures by proposing DAQ, an evaluation approach for experimental protein structure quality that utilizes detection probabilities derived from cryo-EM maps via a pretrained multi-task neural network. In the pursuit of evaluating protein complexes generated through computational methods, I developed GNN-DOVE and DOVE, leveraging convolutional neural networks and graph neural networks to assess the accuracy of predicted protein complex structures. These advancements in cryo-EM-based structural modeling and evaluation methodologies hold significant promise for advancing our understanding of complex macromolecular systems and their biological implications.</p>
|
8 |
New Phages, New Insights: Diversity in Phage Research Leads To Impactful Phage Therapy Outcomes. Harry Jack Ashbaugh (18858763), 22 June 2024.
<p dir="ltr">Bacteriophages are viruses that infect, replicate in, and kill bacteria. In industries that utilize microbes for production, like <i>E.coli</i> in the production of insulin or <i>A. globiformis</i> in the production of cheese, bacteriophages can pose a huge threat to manufacturing. However, bacteriophages aren’t entirely detrimental: we can use the destructive nature of bacteriophages to kill bacterial infections in the human body. This process is known as phage therapy, and while it isn’t a new concept, it is being seen as an increasingly necessary alternative to traditional antibiotics due to the increasing rise of antimicrobial resistance. Because bacteriophages have an entirely different mechanism of destroying bacteria, they can be used in tandem with traditional antibiotic regimens to help wipe out infections. Also, phages have a highly specific host range, meaning that an injection of a certain type of phage will only infect the bacteria it is targeting, sparing important gut microbes.</p><p dir="ltr">The search for new phages to treat infections has resulted in the discovery of over 25,000 actinobacteriophages, with about 4898 of them being sequenced. This is extremely important and necessary, but 49% of these sequenced phages are all mycobacteriophages. This bias towards mycobacteriophages is likely because they infect the genus mycobacterium, where the deadly <i>M. tuberculosis</i> resides. The discovery of new phages using less studied hosts results in novel phages that exhibit rarely seen morphologies, phenotypes, and genotypes. This leads to a better overall understanding of the phage proteome and can lead to new breakthroughs in phage therapy.</p><p dir="ltr">The purpose of this research is to study the differences between different types of phages and try to determine the impact it may have on phage therapy. This thesis is divided into three chapters. In the first chapter, novel phages from different hosts, including <i>M. smegmatis</i> and <i>A. globiformis</i>, were discovered and annotated, and the differences between them were characterized. The discovery of arthrobacteriophages immediately resulted in rare and previously unseen phage characteristics. In the second chapter, proteomic mass spectrometry data of various diverse mycobacteriophages was analyzed to determine differences. Despite being from multiple clusters and lifecycles, the expression data had more similarities than differences. In the third chapter, an alternative method of extracting DNA from phages is explored to determine the result of discrepancies in gel quality from <i>M. smegmatis</i> and <i>A. globiformis.</i><i> </i>Although a large amount of nucleic material was derived, it was not stable DNA and was unsuitable for use. The reason for poor gel quality is still unknown.</p>
|
9 |
Säg är det möjligt för studie- och yrkesvägledare att motverka traditionella könsmönster? / Say, is it possible for study and careers counsellors to counteract traditional gender patterns? Keynemo, Monica, January 2011.
Säg är det möjligt att motverka traditionella könsmönster, trots strukturer som formar oss så att vi omedvetet styrs att välja utbildningar som leder till könstraditionella yrkesval? Att få kunskaper om och ge redskap för ett praktikorienterat jämställdhetsarbete i studie- och yrkesvägledning är syftet med denna aktionsforskningsstudie. De vägledare som deltar har intresse för och kunskap om genusvetenskapliga perspektiv och kan ses som goda exempel. Vägledarna deltar genom två intervjutillfällen och en månads fokusering på uppdraget att motverka traditionella könsmönster. De använder olika metoder och berättar om positiva, neutrala, obekväma, häftiga och negativa reaktioner från sökande. Resultatet beskriver hur vägledarna ser på sitt uppdrag och hur de omsätter sina kunskaper i pedagogisk praktik och hur de bemöter och tolkar reaktioner som de får från sökande. Under arbetets gång ökar den självinsikt som följer av att förstå hur svårt det är att inte göra kön. Metodutveckling har betydelse för att omvandla förhållningssätt till aktiv handling, men det avgörande är attityden till uppdraget. Slutsatser som dras är att uppdraget att motverka traditionella könsmönster kan innebära att upptäcka dessa könsmönster i vardagen, att vägra kategorisera utifrån kön och att inse att vi med hjälp av förändrade förväntningar tillsammans kan ändra det som ses som normalt. / Say, is it possible to counteract traditional gender patterns, even if structures shape us so that we are unconsciously guided to choose courses that lead to gender-traditional career choices? To learn about and provide tools for a practice-oriented work with gender equality in educational and vocational guidance is the purpose of this action research. The counselors involved have an interest in and knowledge of gender perspectives and can be seen as good examples. They participate through two interview sessions and a month-long focus on the mission to counteract traditional gender patterns. They use different methods and report positive, neutral, awkward, strong, and negative reactions from applicants. The results describe how counselors view their mission and how they apply their knowledge in pedagogical practice. It also shows how they respond to and interpret the reactions they receive from applicants. In the process they increase their self-awareness resulting from the understanding of how hard it is not to construct gender. Method development is important for transforming attitudes into active practice, but what matters most is the attitude towards the mission. The conclusions drawn are that the mission to counteract traditional gender patterns may mean detecting these gender patterns in everyday life, refusing to categorize on the basis of gender, and realizing that by changing expectations we can together change what is seen as normal.
|
10 |
Systems Modeling of host microbiome interactions in Inflammatory Bowel Diseases. Javier E Munoz (18431688), 24 April 2024.
<p dir="ltr">Crohn’s disease and ulcerative colitis are chronic inflammatory bowel diseases (IBD) with a rising global prevalence, influenced by clinical and demographics factors. The pathogenesis of IBD involves complex interactions between gut microbiome dysbiosis, epithelial cell barrier disruption, and immune hyperactivity, which are poorly understood. This necessitates the development of novel approaches to integrate and model multiple clinical and molecular data modalities from patients, animal models, and <i>in-vitro</i> systems to discover effective biomarkers for disease progression and drug response. As sequencing technologies advance, the amount of molecular and compositional data from paired measurements of host and microbiome systems is exploding. While it is become routine to generate such rich, deep datasets, tools for their interpretation lag behind. Here, I present a computational framework for integrative modeling of microbiome multi-omics data titled: Latent Interacting Variable Effects (LIVE) modeling. LIVE combines various types of microbiome multi-omics data using single-omic latent variables (LV) into a structured meta-model to determine the most predictive combinations of multi-omics features predicting an outcome, patient group, or phenotype. I implemented and tested LIVE using publicly available metagenomic and metabolomics data set from Crohn’s Disease (CD) and ulcerative colitis (UC) status patients in the PRISM and LLDeep cohorts. The findings show that LIVE reduced the number of features interactions from the original datasets for CD to tractable numbers and facilitated prioritization of biological associations between microbes, metabolites, enzymes, clinical variables, and a disease status outcome. LIVE modeling makes a distinct and complementary contribution to the current methods to integrate microbiome data to predict IBD status because of its flexibility to adapt to different types of microbiome multi-omics data, scalability for large and small cohort studies via reliance on latent variables and dimensionality reduction, and the intuitive interpretability of the meta-model integrating -omic data types.</p><p dir="ltr">A novel application of LIVE modeling framework was associated with sex-based differences in UC. Men are 20% more likely to develop this condition and 60% more likely to progress to colitis-associated cancer compared to women. A possible explanation for this observation is differences in estrogen signaling among men and women in which estrogen signaling may be protective against UC. Extracting causal insights into how gut microbes and metabolites regulate host estrogen receptor β (ERβ) signaling can facilitate the study of the gut microbiome’s effects on ERβ’s protective role against UC. Supervised LIVE models<b> </b>ERβ signaling using high-dimensional gut microbiome data by controlling clinical covariates such as: sex and disease status. LIVE models predicted an inhibitory effect on ER-UP and ER-DOWN signaling activities by pairs of gut microbiome features, generating a novel of catalog of metabolites, microbial species and their interactions, capable of modulating ER. Two strongly positively correlated gut microbiome features: <i>Ruminoccocus gnavus</i><i> </i>with acesulfame and <i>Eubacterium rectale</i><i> </i>with 4-Methylcatechol were prioritized as suppressors ER-UP and ER-DOWN signaling activities. 
An <i>in-vitro</i> experimental validation roadmap is proposed to study the synergistic relationships between metabolites and microbiota suppressors of ERβ signaling in the context of UC. Two i<i>n-vitro</i> systems, HT-29 female colon cancer cell and female epithelial gut organoids are described to evaluate the effect of gut microbiome on ERβ signaling. A detailed experimentation is described per each system including the selection of doses, treatments, metrics, potential interpretations and limitations. This experimental roadmap attempts to compare experimental conditions to study the inhibitory effects of gut microbiome on ERβ signaling and how it could elevate or reduce the risk of developing UC. The intuitive interpretability of the meta-model integrating -omic data types in conjunction with the presented experimental validation roadmap aim to transform an artificial intelligence-generated big data hypothesis into testable experimental predictions.</p>
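As a minimal sketch of the modeling idea described above (not the LIVE implementation), the snippet below reduces each omic block to latent variables and fits a supervised meta-model on the concatenated latent variables to predict disease status; the data, dimensions, and the choice of PCA plus logistic regression are placeholder assumptions.

```python
# Illustrative sketch only (not the LIVE implementation): reduce each omic
# block to latent variables, then fit a supervised meta-model on the
# concatenated latent variables to predict disease status. Data are random
# placeholders standing in for metagenomic and metabolomic feature tables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_samples = 120
metagenomics = rng.normal(size=(n_samples, 500))   # e.g. species abundances
metabolomics = rng.normal(size=(n_samples, 300))   # e.g. metabolite intensities
status = rng.integers(0, 2, size=n_samples)        # e.g. IBD vs. control

def latent_variables(block, n_lv=5):
    """Single-omic latent variables via PCA (one choice of dimensionality reduction)."""
    return make_pipeline(StandardScaler(), PCA(n_components=n_lv)).fit_transform(block)

lv = np.hstack([latent_variables(metagenomics), latent_variables(metabolomics)])

# Meta-model over the latent variables; coefficients point back to the most
# predictive combinations of per-omic components.
meta_model = LogisticRegression(max_iter=1000).fit(lv, status)
print(meta_model.coef_.round(2))
```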
|