About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Effect of pressure on porous materials

McMonagle, Charles James January 2018 (has links)
Research to design and synthesise new porous materials is a rapidly growing field, with thousands of new systems proposed every year owing to their potential use in a multitude of applications across a wide range of fields. Pressure is a powerful tool for characterising structure-property relationships in these materials, the understanding of which is key to unlocking their full potential. In this thesis we investigate a range of porous materials over a wide range of pressures. Over time, the chemical architecture and complexity of porous materials have increased. Although some systems display remarkable stability at high pressures, which we generally take to mean above 1 GPa (10,000 bar), the compressibility of porous materials has in general increased substantially over the last 10 years, rendering most unstable at GPa pressures. Here we present new methods for investigating porous materials at much more moderate pressures (hundreds of bar), alongside more traditional high-pressure methods (diamond anvil cell techniques), finishing with gas sorption studies in a molecular-based porous material. The design and development of a new moderate-pressure sapphire capillary cell for the small-molecule beamline I19 at the Diamond Light Source is described. This cell routinely allowed access to pressures of more than 1000 bar, with a maximum operating pressure of 1500 bar and very precise pressure control (< 10 bar) on both increasing and decreasing pressure. It closes the gap between ambient pressure and the lowest pressures attainable using a diamond anvil cell (DAC), generally above 0.2 GPa (2000 bar). Alongside the development of the sapphire capillary pressure cell, the compression to 1000 bar of the small organic molecule hexamethylenetetramine (hexamine, C6H12N4) and its deuterated form (C6D12N4) was determined, demonstrating the precision possible with this cell.
Solvent uptake into porous materials can induce large structural changes at hundreds of bar. In the case of the Sc-based metal-organic framework (MOF) Sc2BDC3 (BDC = 1,4-benzenedicarboxylate), we used the sapphire capillary pressure cell to study changes in the framework structure on the uptake of n-pentane and isopentane. This work shows how the shape and smaller size of n-pentane facilitated swelling of the framework, which could explain the increased stability of the MOF under applied pressure. The effect of pressure on the previously unreported Cu-framework bis[1-(4-pyridyl)butane-1,3-dione]copper(II) (CuPyr-I) was investigated using high-pressure single-crystal diffraction techniques (DAC). CuPyr-I was found to exhibit high-pressure and low-temperature phase transitions, a pressure-induced Jahn-Teller switch (which was hydrostatic-medium dependent), piezochromism, and negative linear compressibility. Although each of these phenomena has been reported numerous times in a range of materials, this is to the best of our knowledge the first example of all of them observed in the same material. The final two chapters investigate the exceptional thermal, chemical, and mechanical stability of a porous molecular crystal (PMC) system prepared by the co-crystallisation of a cobalt phthalocyanine derivative and a fullerene (C60 or C70). The stabilising fullerene is captured in the cavity between two phthalocyanines in a ball-and-socket arrangement. These PMCs retain their porous structure on evacuation of the solvent of crystallisation; on heating to over 500 K; on prolonged immersion in boiling aqueous acid, base, and water; and at extreme pressures of up to 5.85 GPa, the first reported high-pressure study of a PMC.
The reactive cobalt cation is accessible via the massive interconnected voids (8 nm³), as demonstrated by the adsorption and binding of CO and O2 to the empty metal site using in situ crystallographic methods available at beamline I19, Diamond Light Source.
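The bulk response measured in such compression studies is often summarized as an isothermal compressibility, β = -(1/V)(dV/dP). A minimal sketch of a finite-difference estimate from two pressure points; the hexamine-like unit-cell volumes below are invented for illustration, not values from the thesis:

```python
def compressibility(v1, p1, v2, p2):
    """Isothermal compressibility beta = -(1/V) dV/dP, estimated by a
    finite difference between two (pressure, volume) measurements,
    normalized by the midpoint volume."""
    v_mid = 0.5 * (v1 + v2)
    return -(v2 - v1) / (p2 - p1) / v_mid

# Hypothetical unit-cell volumes (Å^3) at 1 bar and 1000 bar.
beta = compressibility(705.0, 1.0, 700.0, 1000.0)  # units: per bar
```

With precise pressure control (< 10 bar steps), many such points can be fitted to an equation of state rather than a two-point difference.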
2

The metabolomics of chronic stress

Sobsey, Constance Ananta 26 April 2016 (has links)
The World Health Organization has called stress-related illness “the health epidemic of the 21st century.” While the biochemical pathways associated with the acute stress response are well-characterized, many of these pathways behave differently under conditions of chronic stress. The purpose of this project is to apply high-sensitivity mass spectrometry (MS)-based targeted and untargeted metabolomics approaches to generate new insights into the biochemical processes and pathways associated with the chronic stress response, and the potential mechanisms by which chronic stress produces adverse health effects. Chapter 1 describes the application of targeted and untargeted metabolomics approaches to analyze serum samples from a human epigenetic model of chronic stress in order to identify potential targets for further analysis. To test the resulting hypothesis that oxidative stress is a key feature of chronic stress, a new targeted multiple reaction monitoring (MRM)-MS assay was developed for the accurate quantitation of aldehyde products of lipid peroxidation, as described in Chapter 2. In Chapter 3, the validated method for quantitation of malondialdehyde (MDA) was applied to mouse plasma samples from a model of chronic social defeat stress to determine whether animals exposed to psychosocial stress show increases in oxidative stress. Mouse plasma samples from this model were also analyzed by untargeted metabolomics using Fourier-transform (FT)-MS to identify other important metabolite features, particularly those that overlap with metabolites identified in the human epigenetic model. Analysis of metabolomic data from two very different models of chronic stress supports the consistent detection of a metabolomic phenotype for chronic stress that is characterized by the dysregulation of energy metabolism associated with decreased concentrations of diacyl-phospholipids in blood.
Increased blood concentrations of fatty acids, carnitines, acylcarnitines, and ether phospholipids were also observed. In addition to metabolites associated with energy metabolism, chronic stress also significantly influenced metabolites associated with amino acid metabolism and cell death. This characteristic pattern of differences in metabolite concentrations was observed in the plasma of mice exposed to chronic social defeat stress, irrespective of whether or not they displayed outward signs of a chronic stress response; in fact, mice that were “resilient” to the behavioural effects of chronic social defeat stress displayed an exaggerated phenotype compared with mice that showed depressive-like symptoms following chronic stress exposure. This may suggest that the observed changes in fatty acid composition are protective against stress. However, changes in fatty acid composition are also known to be associated with a wide variety of pathologies, including heart disease, neurodegenerative diseases, and mood disorders, so the lipidomic changes associated with chronic stress may also contribute to its health impact. Overall, the results provide further evidence that changes in energy metabolism are a central part of allostatic adaptation to chronic stress.
3

Inorganic polyphosphate in the marine environment: field observations and new analytical techniques

Diaz, Julia M. 31 March 2011 (has links)
Phosphorus (P) is a requirement for biological growth, but this vital nutrient is present at low or limiting concentrations across vast areas of the global surface ocean. Inorganic polyphosphate (poly-P), a linear polymer of at least three orthophosphate units, is one component of the marine P cycle that has been relatively overlooked compared to other P species, owing in part to a lack of routine analytical techniques that cleanly evaluate it within samples. This thesis demonstrates that inorganic poly-P is a quantitatively significant and dynamic component of the global marine P cycle while also establishing two new techniques for its analysis in biological and environmental samples. In Chapter 2, experiments using the freshwater algae Chlamydomonas sp. and Chlorella sp. illustrate X-ray fluorescence spectromicroscopy as a powerful tool for the sub-micron-scale assessment of poly-P composition in organisms. This method enabled the discovery, detailed in Chapter 3, of a mechanism for the long-term sequestration of the vital nutrient P from marine systems via the initial formation of poly-P in surface waters and its eventual transformation into the mineral apatite within sediments. The importance of marine poly-P is further established in Chapter 3 by observations showing that naturally occurring poly-P represents 7-11% of total P in particles and dissolved matter in Effingham Inlet, a eutrophic fjord located on Vancouver Island, British Columbia. In Chapter 4, a new fluorometric protocol based on the interaction of inorganic poly-P with 4',6-diamidino-2-phenylindole (DAPI) is established as a technique for the direct quantification of poly-P in environmental samples. Chapter 5 presents work from Effingham Inlet utilizing this method that shows that inorganic poly-P plays a significant role in the redox-sensitive cycling of P in natural systems.
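Direct fluorometric quantification of this kind typically proceeds by calibrating against standards of known concentration and inverting the fitted curve. A minimal sketch of that workflow, assuming an idealized linear DAPI-poly-P fluorescence response; the standard concentrations and readings below are invented, not values from the thesis:

```python
# Hypothetical calibration standards: (poly-P concentration in µM
# phosphate units, DAPI-poly-P fluorescence in arbitrary units).
standards = [(0.0, 2.0), (5.0, 52.0), (10.0, 102.0), (20.0, 202.0)]

def fit_line(points):
    """Least-squares fit of fluorescence = m * concentration + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

def quantify(fluorescence, m, b):
    """Invert the standard curve to estimate poly-P in an unknown sample."""
    return (fluorescence - b) / m

m, b = fit_line(standards)
conc = quantify(77.0, m, b)  # unknown sample reading -> concentration
```

Real environmental samples would additionally require blank subtraction and checks for matrix effects, which this sketch omits.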
4

Implementace podnikového informačního systému: teorie a praxe / Implementation of business information system: Theory and Practice

Jonáš, Pavel January 2010 (has links)
This thesis aims to familiarise readers with information systems in contemporary business and to identify the greatest weaknesses in the implementation of information systems in enterprises. The issue of implementation is addressed from all sides and angles. First, the implementation process itself and its component parts are analysed. General methods applicable to software development follow, with an evaluation of their use in the implementation process. Alongside the implementation process, part of the work is devoted to management: it discusses how the attitude of a company's management can influence the implementation process. A further problem addressed is the set of key factors influencing the course and success of an implementation. In addition to these factors, the most important risks affecting the various stages of implementation are described, and from these risks three specific scenarios are constructed that could arise in individual contingencies. Readers are gradually drawn into the issues of an actual implementation in real practice, and the individual parts highlight the most significant weaknesses, the places that none of the parties should underestimate. In addition to the weaknesses of the implementation process itself, two basic case studies spanning the whole spectrum of companies are discussed. In conclusion, the implementation process is presented in a graphic sketch in which the individual problems are mapped onto the course of the process itself, so the reader can better picture the implementation, including the sequence of steps. The complete implementation issue is treated from the viewpoints of all interested parties, so it will be useful both to a company owner considering an implementation and to a project manager who deals with the topic. The work is based on the author's years of experience with information systems implementations.
5

Reconstruction and Local Recovery of Data from Synchronization Errors

Minshen Zhu (15334783) 21 April 2023 (has links)
<p>In this thesis we study the complexity of data recovery from synchronization errors, namely insertion and deletion (insdel) errors.</p> <p>Insdel Locally Decodable Codes (Insdel LDCs) are error-correcting codes that admit super-efficient decoding algorithms even in the presence of many insdel errors. The study of such codes for Hamming errors has spanned several decades, whereas work on the insdel analogue amounted to only a few papers before our work. This work initiates a systematic study of insdel LDCs, seeking to bridge this gap by designing codes and proving limitations. Our upper bounds essentially match those for Hamming LDCs in important ranges of parameters, even though insdel LDCs are more general than Hamming LDCs. Our main results are lower bounds that are exponentially stronger than those inherited from Hamming LDCs. These results also have implications for the well-studied variant of relaxed LDCs. For this variant, besides showing the first results in the insdel setting, we also answer an open question for the Hamming variant by showing a strong lower bound.</p> <p>In the trace reconstruction problem, the goal is to recover an unknown source string x ∈ {0,1}^n from random traces, which are obtained by hitting the source string with random deletions/insertions at a fixed rate. Mean-based algorithms are a class of reconstruction algorithms whose outputs depend only on the empirical estimates of individual bits. The number of traces needed for mean-based trace reconstruction has already been settled. We further study the performance of mean-based algorithms in a scenario where one wants to distinguish between two source strings parameterized by their edit distance, and we also provide explicit constructions of strings that are hard to distinguish.
We further establish an equivalence to the Prouhet-Tarry-Escott problem from number theory, which ends up being an obstacle to constructing explicit hard instances against mean-based algorithms.</p>
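The mean-based setup can be illustrated on a deletion-only channel: draw many random traces of the source string and average the bit observed at each trace position. A minimal sketch; the deletion rate, zero-padding convention, and bit string are illustrative choices, not parameters from the thesis:

```python
import random

def random_trace(source, p_del=0.1):
    """One trace: each bit of the source is deleted independently."""
    return [b for b in source if random.random() > p_del]

def mean_based_estimates(source, n_traces=2000, p_del=0.1, seed=0):
    """Empirical per-position bit means over many random traces.

    Traces are shorter than the source, so positions past the end of a
    trace contribute 0 (zero-padding) before averaging.
    """
    random.seed(seed)
    n = len(source)
    sums = [0.0] * n
    for _ in range(n_traces):
        for i, b in enumerate(random_trace(source, p_del)):
            sums[i] += b
    return [s / n_traces for s in sums]

source = [1, 0, 1, 1, 0, 0, 1, 0]
means = mean_based_estimates(source)
```

A mean-based reconstruction algorithm sees only the vector `means`; the hardness results concern pairs of source strings whose mean vectors are nearly identical.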
6

Flexible and Data-Driven Modeling of 3D Protein Complex Structures

Charles W Christoffer (17482395) 30 November 2023 (has links)
<p dir="ltr">Proteins and their interactions with each other, with nucleic acids, and with other molecules are foundational to all known forms of life. The three-dimensional structures of these interactions are an essential component of a comprehensive understanding of how they function. Molecular-biological hypothesis formulation and rational drug design are both often predicated on a particular structure model of the molecule or complex of interest. While experimental methods capable of determining atomic-detail structures of molecules and complexes exist, such as the popular X-ray crystallography and cryo-electron microscopy, these methods require both laborious sample preparation and expensive instruments with limited throughput. Computational methods of predicting complex structures are therefore desirable if they can enable cheap, high-throughput virtual screening of the space of biological hypotheses. Many common biomolecular contexts have largely been blind spots for predictive modeling of complex structures. In this direction, docking methods are proposed to address extreme conformational change, nonuniform environments, and distance-geometric priors. Flex-LZerD deforms a flexible protein using a novel fitting procedure based on iterated normal mode decomposition and was shown to construct accurate complex models even when an initial input subunit structure exhibits extreme conformational differences from its bound state. Mem-LZerD efficiently constrains the docking search space by augmenting the geometric hashing data structure at the core of the LZerD algorithm and enabled membrane protein complexes to be efficiently and accurately modeled. Finally, atomic distance-based approaches developed during modeling competitions and collaborations with wet lab biologists were shown to effectively integrate domain knowledge into complex modeling pipelines.</p>
7

HIGHLY ACCURATE MACROMOLECULAR STRUCTURE COMPLEX DETECTION, DETERMINATION AND EVALUATION BY DEEP LEARNING

Xiao Wang (17405185) 17 November 2023 (has links)
<p dir="ltr">In life sciences, the determination of macromolecular structures and their functions, particularly proteins and protein complexes, is of paramount importance, as these molecules play critical roles within cells. The specific physical interactions of macromolecules govern molecular and cellular functions, making the 3D structure elucidation of these entities essential for comprehending the mechanisms underlying life processes, diseases, and drug discovery. Cryo-electron microscopy (cryo-EM) has emerged as a promising experimental technique for obtaining 3D macromolecular structures. In the course of my research, I proposed CryoREAD, an innovative AI-based method for <i>de novo</i> DNA/RNA structure modeling. This novel approach represents the first fully automated solution for DNA/RNA structure modeling from cryo-EM maps at near-atomic resolution. However, as the resolution decreases, structure modeling becomes significantly more challenging. To address this challenge, I introduced Emap2sec+, a 3D deep convolutional neural network designed to identify protein secondary structures, RNA, and DNA information from cryo-EM maps at intermediate resolutions ranging from 5-10 Å. Additionally, I presented Alpha-EM-Multimer, a groundbreaking method for automatically building full protein complexes from cryo-EM maps at intermediate resolution. Alpha-EM-Multimer employs a diffusion model to trace the protein backbone and subsequently fits the AlphaFold-predicted single-chain structures to construct the complete protein complex. Notably, this method stands as the first to enable the modeling of protein complexes with more than 10,000 residues for cryo-EM maps at intermediate resolution, achieving an average TM-Score of predicted protein complexes above 0.8, which closely approximates the native structure.
Furthermore, I addressed the recognition of local structural errors in predicted and experimental protein structures by proposing DAQ, an evaluation approach for experimental protein structure quality that utilizes detection probabilities derived from cryo-EM maps via a pretrained multi-task neural network. In the pursuit of evaluating protein complexes generated through computational methods, I developed GNN-DOVE and DOVE, leveraging convolutional neural networks and graph neural networks to assess the accuracy of predicted protein complex structures. These advancements in cryo-EM-based structural modeling and evaluation methodologies hold significant promise for advancing our understanding of complex macromolecular systems and their biological implications.</p>
8

Säg är det möjligt för studie- och yrkesvägledare att motverka traditionella könsmönster? / Say, is it possible for guidance counsellors to counteract traditional gender patterns?

Keynemo, Monica January 2011 (has links)
Say, is it possible to counteract traditional gender patterns, even though structures shape us so that we are unconsciously guided to choose courses that lead to gender-traditional career choices? The purpose of this action research study is to provide knowledge about, and tools for, practice-oriented gender equality work in educational and vocational guidance. The counsellors involved have an interest in and knowledge of gender perspectives and can be seen as good examples. They participate through two interview sessions and a month's focus on the mission to counteract traditional gender patterns. They use different methods and report positive, neutral, awkward, strong, and negative reactions from applicants. The results describe how the counsellors view their mission and how they apply their knowledge in pedagogical practice, as well as how they respond to and interpret the reactions they receive from applicants. In the process, their self-awareness increases as they come to understand how hard it is not to "do gender". Method development matters for transforming attitudes into active practice, but what is decisive is the attitude towards the mission. The conclusions drawn are that the mission to counteract traditional gender patterns may mean detecting these gender patterns in everyday life, refusing to categorize on the basis of gender, and realizing that by changing expectations we can together change what is seen as normal.
9

Systems Modeling of host microbiome interactions in Inflammatory Bowel Diseases

Javier E Munoz (18431688) 24 April 2024 (has links)
<p dir="ltr">Crohn’s disease and ulcerative colitis are chronic inflammatory bowel diseases (IBD) with a rising global prevalence, influenced by clinical and demographic factors. The pathogenesis of IBD involves complex, poorly understood interactions between gut microbiome dysbiosis, epithelial cell barrier disruption, and immune hyperactivity. This necessitates novel approaches that integrate and model multiple clinical and molecular data modalities from patients, animal models, and <i>in-vitro</i> systems to discover effective biomarkers of disease progression and drug response. As sequencing technologies advance, the amount of molecular and compositional data from paired measurements of host and microbiome systems is exploding. While it has become routine to generate such rich, deep datasets, tools for their interpretation lag behind. Here, I present a computational framework for integrative modeling of microbiome multi-omics data titled Latent Interacting Variable Effects (LIVE) modeling. LIVE combines various types of microbiome multi-omics data using single-omic latent variables (LVs) in a structured meta-model to determine the most predictive combinations of multi-omics features for an outcome, patient group, or phenotype. I implemented and tested LIVE using publicly available metagenomic and metabolomic data from patients with Crohn’s disease (CD) and ulcerative colitis (UC) in the PRISM and LLDeep cohorts. The findings show that LIVE reduced the number of feature interactions from the original CD datasets to tractable numbers and facilitated prioritization of biological associations between microbes, metabolites, enzymes, clinical variables, and disease status.
LIVE modeling makes a distinct and complementary contribution to current methods for integrating microbiome data to predict IBD status because of its flexibility to adapt to different types of microbiome multi-omics data, its scalability to large and small cohort studies via reliance on latent variables and dimensionality reduction, and the intuitive interpretability of a meta-model integrating -omic data types.</p><p dir="ltr">A novel application of the LIVE modeling framework addressed sex-based differences in UC. Men are 20% more likely to develop this condition and 60% more likely to progress to colitis-associated cancer than women. A possible explanation for this observation is a difference in estrogen signaling between men and women, whereby estrogen signaling may be protective against UC. Extracting causal insights into how gut microbes and metabolites regulate host estrogen receptor β (ERβ) signaling can facilitate the study of the gut microbiome’s effect on ERβ’s protective role against UC. Supervised LIVE models predict ERβ signaling from high-dimensional gut microbiome data while controlling for clinical covariates such as sex and disease status. LIVE models predicted an inhibitory effect on ER-UP and ER-DOWN signaling activities by pairs of gut microbiome features, generating a novel catalog of metabolites, microbial species, and their interactions capable of modulating ERβ. Two strongly positively correlated pairs of gut microbiome features, <i>Ruminococcus gnavus</i> with acesulfame and <i>Eubacterium rectale</i> with 4-methylcatechol, were prioritized as suppressors of ER-UP and ER-DOWN signaling activities. An <i>in-vitro</i> experimental validation roadmap is proposed to study the synergistic relationships between metabolite and microbiota suppressors of ERβ signaling in the context of UC.
Two <i>in-vitro</i> systems, HT-29 female colon cancer cells and female gut epithelial organoids, are described for evaluating the effect of the gut microbiome on ERβ signaling. Detailed experiments are described for each system, including the selection of doses, treatments, metrics, potential interpretations, and limitations. This experimental roadmap compares experimental conditions for studying the inhibitory effects of the gut microbiome on ERβ signaling and how they could raise or lower the risk of developing UC. The intuitive interpretability of the meta-model integrating -omic data types, in conjunction with the proposed experimental validation roadmap, aims to transform an artificial-intelligence-generated big-data hypothesis into testable experimental predictions.</p>
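The core idea — collapsing each omic block to a latent score per sample, then combining those scores in a meta-model that predicts a phenotype — can be sketched in miniature. This is a stand-in using z-scored feature means and a sign rule rather than LIVE's actual latent-variable machinery; all data below are invented:

```python
def zscores(column):
    """Standardize one feature column to zero mean, unit variance."""
    mu = sum(column) / len(column)
    sd = (sum((v - mu) ** 2 for v in column) / len(column)) ** 0.5 or 1.0
    return [(v - mu) / sd for v in column]

def latent_variable(block):
    """Collapse one omic block (samples x features) to a single latent
    score per sample: the mean of z-scored features, a toy stand-in for
    the first component of a real dimensionality reduction."""
    cols = [zscores(col) for col in zip(*block)]
    return [sum(row) / len(row) for row in zip(*cols)]

# Toy paired data: 4 samples with a metagenomic block (3 features) and a
# metabolomic block (2 features); labels: 1 = disease, 0 = control.
metagenomics = [[5, 8, 4], [6, 9, 5], [1, 2, 0], [0, 1, 1]]
metabolomics = [[2.0, 1.9], [2.2, 2.1], [0.5, 0.1], [0.4, 0.3]]
labels = [1, 1, 0, 0]

lv_meta = latent_variable(metagenomics)
lv_metab = latent_variable(metabolomics)
# Meta-model: combine per-omic latent scores into one score per sample.
combined = [a + b for a, b in zip(lv_meta, lv_metab)]
predicted = [1 if s > 0 else 0 for s in combined]
```

The payoff of working at the latent-variable level is the same as described above: a handful of interpretable per-omic scores replaces thousands of raw feature interactions.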
10

Interdisciplinary assessment of the potential for improving Integrated Pest Management practice in Scottish spring barley

Stetkiewicz, Stacia Serreze January 2018 (has links)
Integrated Pest Management (IPM) has long been promoted as a means of reducing reliance on pesticide inputs as compared to conventional farming systems. Reduced pesticide application could be beneficial due to the links between intensive pesticide use and negative impacts upon biodiversity and human health, as well as the development of pesticide resistance. Work assessing the potential of IPM in cereal production is currently limited, however, and previous findings have generally covered the subject from the perspective of either field trial data or social science studies of farmer behaviour. This thesis helps to address this knowledge gap by providing a more holistic assessment of IPM in Scottish spring barley production (selected because of its dominance in Scotland’s arable production systems), in relation to three of its most damaging fungal pathogens: Rhynchosporium commune, Blumeria graminis f.sp. hordei, and Ramularia collo-cygni. Several IPM techniques of potential relevance to the sector were identified, and the prospects of three in particular – crop rotation, varietal disease resistance, and forecasting disease pressure – were assessed in several ways. Preliminary analysis of experimental field trial data collected from 2011 – 2014 across Scotland found that the majority of spring barley trials in this period (65%) did not show a statistically significant impact of fungicide treatment on yield, with the average yield increase due to fungicide application being 0.62 t/ha. This initial analysis was expanded upon using stepwise regressions of long-term (1996 – 2014) field trial data from the same dataset. Here, the difference between treated and untreated yields could be explained by disease resistance, average seasonal rainfall (whereby wetter seasons saw an increased impact of fungicide use on yield), and high combined disease severity.
Stakeholder surveying provided information about current practice and attitudes towards the selected IPM techniques amongst a group of 43 Scottish spring barley farmers and 36 agronomists. Stakeholders were broadly open to taking up IPM measures on farm; sowing of disease resistant varieties was most frequently selected as the best technique in terms of both practicality and cost, though individual preference varied. However, a disparity was seen between farmer perception of their uptake of IPM and actual, self-reported uptake for both varietal disease resistance and rotation. Farmers and agronomists also overestimated the impact of fungicide use as compared with the field trials results – the majority of stakeholders believed fungicide treatment to increase yields by 1 - 2 t/ha, while the majority of 2011 – 2014 field trials had a yield difference of under 1 t/ha. The reasons behind these differences between perception and practice are not currently known. Finally, an annual survey of commercial crops, gathered from 552 farms across Scotland (from 2009 – 2015), highlighted two gaps where IPM practice could be improved upon. Firstly, relatively few of the varieties listed in the commercial crops database were highly resistant to the three diseases – 26.1% were highly resistant to Ramularia, 14.2% to Rhynchosporium, and 58.1% to mildew. Secondly, 71% of the farms included in the database had planted barley in at least two consecutive seasons, indicating that crop rotation practices could be improved. The overarching finding of this project is that there is scope for IPM uptake to be improved upon and fungicide use to be reduced while maintaining high levels of yield in Scottish spring barley production. Incorporating experimental field data, stakeholder surveying, and commercial practice data offered a unique view into the potential for IPM in this sector, and provided insights which could not have been gained through the lens of a single discipline.
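The kind of regression used above — explaining the treated-untreated yield gain by covariates such as seasonal rainfall — can be sketched with a single-predictor least-squares fit. The trial records below are invented for illustration, not data from the thesis:

```python
def ols(x, y):
    """Slope and intercept of y = a*x + b by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    return a, my - a * mx

# Hypothetical trial records: (treated yield t/ha, untreated yield t/ha,
# seasonal rainfall mm). The response is the fungicide yield gain.
trials = [(7.1, 6.9, 300), (7.8, 7.2, 450), (8.0, 7.1, 520), (7.4, 7.0, 380)]
gain = [treated - untreated for treated, untreated, _ in trials]
rain = [r for _, _, r in trials]
slope, intercept = ols(rain, gain)  # positive slope: wetter seasons,
                                    # larger fungicide yield gain
```

A stepwise procedure like the one in the thesis would repeat such fits, adding or dropping covariates (disease resistance, combined disease severity) according to their explanatory contribution.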
