281 |
The impact of Niacin on PCSK9 levels in vervet monkeys (Chlorocebus aethiops)
Ngqaneka, Thobile January 2020 (has links)
Magister Pharmaceuticae - MPharm / Cardiovascular diseases (CVDs) such as ischaemic heart disease, heart failure and stroke remain a major cause of death globally. Various deep-rooted factors influence CVD development; these include, but are not limited to, elevated blood lipids, high blood pressure, obesity and diabetes. A considerable number of proteins are involved directly and indirectly in the transport, maintenance and elimination of plasma lipids, including high- and low-density lipoprotein cholesterol (HDL-C and LDL-C). There are several mechanisms involved in the removal of LDL particles from systemic circulation. One such mechanism is associated with the gene that encodes proprotein convertase subtilisin/kexin type 9 (PCSK9), which has become an exciting therapeutic target for reducing the residual risk of CVDs. Currently, statins are the mainstay treatment to reduce LDL-C, and a need exists to develop more effective LDL-C-lowering drugs that might supplement statins. This study aimed to contribute to knowledge of the effect of niacin in reducing LDL levels through PCSK9 interaction. The objectives were achieved using two approaches: animal intervention with niacin, followed by genetic screening of five prioritized genes involved in cholesterol synthesis and regulation. For the animal intervention, 16 vervet monkeys were divided into two groups of eight animals, a control and an experimental (niacin) group. The control group was given a normal standard diet of pre-cooked maize meal throughout the study, while the experimental group received the same diet supplemented with 100 mg/kg of niacin (SR) for 12 weeks. During the niacin intervention, blood was collected at baseline, every four weeks during the treatment period, and at the end of the washout period. 
The collected blood was used for biochemical analysis (total cholesterol, triglycerides, LDL-C, and HDL-C) and downstream genetic applications. The second phase included the screening of PCSK9, LDLR, SREBP-2, CETP and APOB-100 using genotyping and gene expression analysis. Niacin administration produced statistically significant increases in plasma HDL-C at four time points (T1, T2, T3 and T4), which resulted in an overall increase in plasma HDL-C. Additionally, niacin administration resulted in a slight reduction in LDL-C and total cholesterol levels. Furthermore, the genotyping analysis revealed 13 sequence variants in the PCSK9, LDLR, SREBP-2, CETP and APOB-100 genes. Five of these variants were predicted to be disease-causing and correlated with gene expression patterns. Three identified PCSK9 variants (H177N, R148S, G635G) were categorized as loss-of-function (LOF) mutations, which was supported by a decline in gene expression in animals harbouring these variants. LDLR also harboured LOF variants, consistent with its decreased mRNA expression. Additionally, SREBP-2 proved to be a key mediator of cholesterol pathways. Therefore, the findings of the study strongly suggest that niacin increases HDL-C and decreases LDL-C and total cholesterol. Moreover, an interaction between niacin administration and PCSK9 was observed, which resulted in decreased gene expression.
|
282 |
Etude des modifications de l'apolipoprotéine B-100 induites par la myéloperoxydase à l'aide de la chromatographie liquide couplée à la spectrométrie de masse / Study of the modifications of apolipoprotein B-100 induced by myeloperoxidase using liquid chromatography coupled to mass spectrometry
Delporte, Cédric 14 September 2012 (has links)
Cardiovascular diseases are the leading cause of death worldwide, and atherosclerosis is the primary causal factor of these diseases. Among the many atherogenic risk factors, one is frequently cited: the modification of low-density lipoproteins (LDLs). Although the process of atherogenesis is not yet fully understood, it is currently accepted that native LDLs cross the vascular wall and accumulate in the subendothelial space, where they are oxidized and endocytosed by macrophages. A more recent theory holds that LDLs can also be modified in the circulation.

Nevertheless, the process by which these lipoproteins are modified remains highly controversial. In recent years, modification of LDLs by myeloperoxidase has emerged as a pathophysiological model, in contrast to the long-used model of LDL oxidation by copper. Myeloperoxidase is an enzyme found in the primary granules of neutrophils which, during chronic inflammation such as in atherosclerosis, can reach the extracellular medium and form a powerful oxidant that attacks proteins, lipids and nucleic acids. LDLs modified by myeloperoxidase are no longer recognized by the membrane receptor specific for LDLs. Moreover, very few studies to date have described the modifications that myeloperoxidase makes to LDLs.

In this context, we studied the specificity of myeloperoxidase in modifying LDLs. In this model, the protein moiety of the lipoprotein is the main target. We therefore developed and optimized mass spectrometry methods for the analysis of apolipoprotein B-100, the sole protein of the LDL. In addition, the activity of myeloperoxidase at the surface of LDLs was investigated.

The results of this work show that myeloperoxidase attacks LDLs in a specific manner and that the chemical model using hypochlorous acid to mimic the action of myeloperoxidase is imperfect. Finally, we also observed changes in enzymatic activity when myeloperoxidase is adsorbed onto the surface of LDLs. / Doctorat en Sciences biomédicales et pharmaceutiques / info:eu-repo/semantics/nonPublished
|
283 |
Practice Patterns in Treating High-Risk Patients With Hyperlipidemia at a Northeast Tennessee University Clinic
Ismail, Hassan M., Simmons, Christina, Pfortmiller, Deborah 01 January 2005 (has links)
Background: This study was conducted to test the hypothesis that internal medicine residents at a northeast Tennessee university clinic were not compliant with the latest National Cholesterol Education Program Adult Treatment Panel (NCEP-ATP) guidelines in treating hyperlipidemia in patients with diabetes and coronary artery disease. Methods: A retrospective medical record survey was conducted to evaluate residents' patterns in lowering low-density lipoprotein (LDL) cholesterol to below 100 mg/dL in patients with diabetes and coronary artery disease. The survey covered a 5-year period, from July 1998 to June 2003, and included 15 randomly chosen residents who were in training for 3 consecutive years. Charts were randomly selected from residents' clinics using International Classification of Diseases-9 codes for coronary artery disease or diabetes mellitus with hyperlipidemia. Five hundred fifty charts were reviewed; only 41 (7.45%) met the inclusion criteria. Results: Analysis of data using Epi-Info 2002 (Centers for Disease Control and Prevention, Atlanta, GA) revealed that only 68.3% of patients with diabetes and coronary artery disease reached target LDL cholesterol levels. Of the patients who reached target levels, only 42.9% maintained them. Analysis of variance and chi-square tests revealed that the frequency of cholesterol measurement, but not the frequency of physicians' visits, was associated with a higher likelihood of reaching the target LDL level. Conclusion: Among high-risk patients with hyperlipidemia, internal medicine residents showed suboptimal compliance with the latest NCEP-ATP guidelines in the frequency of screening for, reaching, and maintaining the target LDL cholesterol level.
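As a small worked check of the figures above (assuming, as the wording suggests, that the 42.9% maintenance rate is conditional on having reached the target):

```python
# Illustrative arithmetic derived from the percentages in the abstract;
# the variable names are ours, and the conditional reading of 42.9% is
# an assumption, not stated explicitly by the study.
reached = 0.683      # fraction of patients who reached the target LDL level
maintained = 0.429   # of those, the fraction who then maintained it
overall = reached * maintained
print(f"{overall:.1%}")  # roughly 29.3% both reached and maintained the target
```

Under that reading, fewer than a third of high-risk patients ended up both reaching and maintaining the guideline target.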
|
284 |
Applications of Mathematical Optimization Methods to Digital Communications and Signal Processing
Giddens, Spencer 29 July 2020 (has links)
Mathematical optimization is applicable to nearly every scientific discipline. This thesis specifically focuses on optimization applications to digital communications and signal processing. Within the digital communications framework, the channel encoder attempts to encode a message from a source (the sender) in such a way that the channel decoder can utilize the encoding to correct errors in the message caused by the transmission over the channel. Low-density parity-check (LDPC) codes are an especially popular code for this purpose. Following the channel encoder in the digital communications framework, the modulator converts the encoded message bits to a physical waveform, which is sent over the channel and converted back to bits at the demodulator. The modulator and demodulator present special challenges for what is known as the two-antenna problem. The main results of this work are two algorithms related to the development of optimization methods for LDPC codes and the two-antenna problem. Current methods for optimization of LDPC codes analyze the degree distribution pair asymptotically as block length approaches infinity. This effectively ignores the discrete nature of the space of valid degree distribution pairs for LDPC codes of finite block length. While large codes are likely to conform reasonably well to the infinite block length analysis, shorter codes have no such guarantee. Chapter 2 more thoroughly introduces LDPC codes, and Chapter 3 presents and analyzes an algorithm for completely enumerating the space of all valid degree distribution pairs for a given block length, code rate, maximum variable node degree, and maximum check node degree. This algorithm is then demonstrated on an example LDPC code of finite block length. Finally, we discuss how the result of this algorithm can be utilized by discrete optimization routines to form novel methods for the optimization of small block length LDPC codes. 
In order to solve the two-antenna problem, which is introduced in greater detail in Chapter 2, it is necessary to obtain reliable estimates of the timing offset and channel gains caused by the transmission of the signal through the channel. The timing offset estimator can be formulated as an optimization problem, and an optimization method used to solve it was previously developed. However, this optimization method does not utilize gradient information, and as a result is inefficient. Chapter 4 presents and analyzes an improved gradient-based optimization method that solves the two-antenna problem much more efficiently.
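As a rough illustration of the enumeration idea described above, the sketch below brute-forces all (variable, check) degree-count pairs for a tiny code. The function names and toy parameters are hypothetical, the matching criterion is simply that both sides account for the same number of edges, and practical constraints (such as a minimum node degree of 2) are ignored:

```python
def degree_counts(total_nodes, max_degree):
    """Yield all tuples (c_1, ..., c_max_degree) of node counts per degree
    that sum to total_nodes."""
    def rec(remaining, degree):
        if degree == max_degree:
            yield (remaining,)
            return
        for c in range(remaining + 1):
            for rest in rec(remaining - c, degree + 1):
                yield (c,) + rest
    yield from rec(total_nodes, 1)

def enumerate_distribution_pairs(n, rate, dv_max, dc_max):
    """List all (variable, check) degree-count pairs for block length n and
    code rate `rate` whose total edge counts agree."""
    m = round(n * (1 - rate))  # number of check nodes
    pairs = []
    for var in degree_counts(n, dv_max):
        e_v = sum((i + 1) * c for i, c in enumerate(var))  # edges on variable side
        for chk in degree_counts(m, dc_max):
            e_c = sum((j + 1) * c for j, c in enumerate(chk))  # edges on check side
            if e_v == e_c and e_v > 0:
                pairs.append((var, chk))
    return pairs

# Toy instance: block length 6, rate 1/2, degrees at most 3 (variable) / 4 (check).
pairs = enumerate_distribution_pairs(n=6, rate=0.5, dv_max=3, dc_max=4)
print(len(pairs))
```

The search space grows combinatorially, which is exactly why the thesis's point about finite block lengths matters: only for short codes is complete enumeration like this feasible at all.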
|
285 |
Codes correcteurs quantiques pouvant se décoder itérativement / Iteratively-decodable quantum error-correcting codes
Maurice, Denise 26 June 2014 (has links)
On sait depuis vingt ans maintenant qu'un ordinateur quantique permettrait de résoudre en temps polynomial plusieurs problèmes considérés comme difficiles dans le modèle classique de calcul, comme la factorisation ou le logarithme discret. Entre autres, un tel ordinateur mettrait à mal tous les systèmes de chiffrement à clé publique actuellement utilisés en pratique, mais sa réalisation se heurte, entre autres, aux phénomènes de décohérence qui viennent entacher l'état des qubits qui le constituent. Pour protéger ces qubits, on utilise des codes correcteurs quantiques, qui doivent non seulement être performants mais aussi munis d'un décodage très rapide, sous peine de voir s'accumuler les erreurs plus vite qu'on ne peut les corriger. Une solution très prometteuse est fournie par des équivalents quantiques des codes LDPC (Low Density Parity Check, à matrice de parité creuse). Ces codes classiques offrent beaucoup d'avantages : ils sont faciles à générer, rapides à décoder (grâce à un algorithme de décodage itératif) et performants. Mais leur version quantique se heurte (entre autres) à deux problèmes. On peut voir un code quantique comme une paire de codes classiques, dont les matrices de parité sont orthogonales entre elles. Le premier problème consiste alors à construire deux « bons » codes qui vérifient cette propriété. L'autre vient du décodage : chaque ligne de la matrice de parité d'un des codes fournit un mot de code de poids faible pour le second code. En réalité, dans un code quantique, les erreurs correspondantes sont bénignes et n'affectent pas le système, mais il est difficile d'en tenir compte avec l'algorithme de décodage itératif usuel. On étudie dans un premier temps une construction existante, basée sur un produit de deux codes classiques. 
Cette construction, qui possède de bonnes propriétés théoriques (dimension et distance minimale), s'est avérée décevante dans les performances pratiques, qui s'expliquent par la structure particulière du code produit. Nous proposons ensuite plusieurs variantes de cette construction, possédant potentiellement de bonnes propriétés de correction. Ensuite, on étudie des codes dits q-aires : ce type de construction, inspiré des codes classiques, consiste à agrandir un code LDPC existant en augmentant la taille de son alphabet. Cette construction, qui s'applique à n'importe quel code quantique 2-régulier (c'est-à-dire dont les matrices de parité possèdent exactement deux 1 par colonne), a donné de très bonnes performances dans le cas particulier du code torique. Ce code bien connu se décode usuellement très bien avec un algorithme spécifique, mais mal avec l'algorithme usuel de propagation de croyances. Enfin, un équivalent quantique des codes spatialement couplés est proposé. Cette idée vient également du monde classique, où elle améliore de façon spectaculaire les performances des codes LDPC : le décodage s'effectue en temps quasi-linéaire et atteint, de manière prouvée, la capacité des canaux symétriques à entrées binaires. Si dans le cas quantique, la preuve éventuelle reste encore à faire, certaines constructions spatialement couplées ont abouti à d'excellentes performances, bien au-delà de toutes les autres constructions de codes LDPC quantiques proposées jusqu'à présent. / Quantum information is a developing field of study with various applications (in cryptography, fast computing, ...). Its basic element, the qubit, is volatile: any measurement changes its value. This also applies to involuntary measurements due to imperfect isolation (as seen in any practical setting). Unless we can detect and correct these modifications, any quantum computation is bound to fail. 
These unwanted modifications remind us of errors that can happen in the transmission of a (classical) message. These errors can be accounted for with an error-correcting code. For quantum errors, we need quantum error-correcting codes. In order to prevent the accumulation of errors that cannot be compensated, these quantum error-correcting codes need to be both efficient and fast. Among classical error-correcting codes, Low Density Parity Check (LDPC) codes provide many perks: they are easy to create, fast to decode (with an iterative decoding algorithm known as belief propagation) and close to optimal. Their quantum equivalents should then be good candidates, even if they present two major drawbacks (among other less important ones). A quantum error-correcting code can be seen as a combination of two classical codes with orthogonal parity-check matrices. The first issue is the building of two efficient codes with this property. The other is in the decoding: each row of the parity-check matrix of one code gives a low-weight codeword of the other code. In fact, with quantum codes, the corresponding errors do not affect the system, but they are difficult to account for with the usual iterative decoding algorithm. First, this thesis studies an existing construction, based on the product of two classical codes. This construction has good theoretical properties (dimension and minimal distance), but has shown disappointing practical results, which are explained by the resulting code's structure. Several variations, which could have good theoretical properties, are also analyzed but produce no usable results at this time. We then move to the study of q-ary codes. This construction, derived from classical codes, enlarges an existing LDPC code by augmenting its alphabet. 
It applies to any 2-regular quantum code (meaning one whose parity-check matrices have exactly two ones per column) and gives good performance with the well-known toric code, which can be easily decoded with its own specific algorithm (but not that easily with the usual belief-propagation algorithm). Finally, this thesis explores a quantum equivalent of spatially coupled codes, an idea also derived from the classical field, where it greatly enhances the performance of LDPC codes: decoding runs in quasi-linear time and provably achieves the capacity of binary-input symmetric channels. While in the quantum case such a proof is still lacking, some spatially coupled constructions have led to excellent performance, well beyond all other quantum LDPC code constructions proposed so far.
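The orthogonality condition mentioned above (a quantum code viewed as a pair of classical codes whose parity-check matrices are mutually orthogonal over GF(2)) can be illustrated numerically. A minimal sketch, in which the helper name is ours and the toy matrix is the [7,4] Hamming parity-check matrix, famously reusable for both halves of the Steane code:

```python
import numpy as np

def is_css_pair(h_x, h_z):
    """Check the CSS condition: every row of h_x is orthogonal (mod 2)
    to every row of h_z, i.e. h_x @ h_z.T == 0 over GF(2)."""
    return not np.any((h_x @ h_z.T) % 2)

# Parity-check matrix of the [7,4] Hamming code. Because this code
# contains its own dual, the matrix is orthogonal to itself mod 2,
# which is why it can serve as both H_X and H_Z of a quantum code.
h = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
print(is_css_pair(h, h))  # → True
```

A pair of arbitrary matrices will generally fail this test, which is the first of the two difficulties the thesis names: finding two codes that are simultaneously good and mutually orthogonal.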
|
286 |
Use of Pyrolyzed Soybean Hulls as Fillers in Polyolefins
Coben, Collin 09 July 2020 (has links)
No description available.
|
287 |
Amyloid beta inducerad klyvning av NG2 medierad via LRP-1 receptorn / Amyloid beta-induced shedding of NG2 mediated via the LRP-1 receptor
Hallberg, Anna January 2014 (has links)
Bakgrund: Deposition av fibrillär amyloid beta 1-42 (Aβ) i hjärnan är ett välkänt kännetecken för den neurodegenerativa sjukdomen Alzheimer’s (AD). Dessa ansamlingar påverkar pericyter, en celltyp involverad i blodkärlsfunktion och upprätthållande av blodhjärnbarriären (BBB). Pericyter uttrycker både receptorn low density lipoprotein receptor related protein 1 (LRP-1) till vilken Aβ1-42 binder, och proteoglykanet NG2. NG2 har stor betydelse för pericyters samspel med endotelceller och i sin lösliga form (sNG2) främjar den angiogenes. Tidigare studier har visat att mängden NG2 som klyvs från pericyter förändras när de stimuleras med Aβ1-42. Syfte: Att undersöka om Aβ1-42 påverkar NG2 klyvning via LRP-1 Metod: Human brain vascular pericytes (HBVP) stimulerades med monomera, oligomera och fibrillära Aβ1-42 preparationer. Uttrycket av LRP-1 tystades med small interfering (si) LRP-1 och knockdown efficiency analyserades med Western Blot (WB). Även Aβ1-42 preparationer undersöktes med WB. Cellviabilitet mättes med laktatdehydrogenas (LDH) test och proteininnehåll med Bradford analys. Slutligen mättes mängden sNG2 i pericytmedium med hjälp av enzyme-linked immunosorbant assay (ELISA) baserad på electrochemiluminescence (Mesoscale). Resultat: Preparationerna med monomer och oligomer Aβ1-42 ökade NG2 klyvning. Denna Aβ1-42 inducerade ökning försvann när cellernas LRP-1 tystats. Aβ1-42 fibrillpreparationerna inhiberade däremot NG2 klyvningen oavsett närvaro av LRP-1. Aβ1-42 monomerpreparationer inducerade celldöd hos HBVP med LRP-1 men inte hos de HBVP där LRP-1 tystats, och cellviabiliteten hos HBVP ökade hos celler som stimulerats med Aβ1-42 fibrillpreparation och där LRP-1 tystats. Konklusion: Resultaten visar att Aβ1-42 monomer och oligomer påverkar NG2 klyvning via LRP-1. Däremot verkar Aβ1-42 fibrill istället påverka NG2 klyvning via en annan signalväg. 
Studien belyser hur Aβ1-42 kan påverka pericyter, vilket kan föreligga vaskulära förändringar kopplade till AD patologi. / Background: The deposition of fibrillar amyloid beta 1-42 (Aβ) in the brain is a well-known characteristic of the neurodegenerative Alzheimer's disease (AD). These accumulations affect pericytes, a cell type implicated in vessel function and maintenance of the blood-brain barrier (BBB). Pericytes express both the receptor low-density lipoprotein receptor-related protein 1 (LRP-1), to which Aβ1-42 binds, and the proteoglycan NG2. NG2 is important for the interaction between pericytes and endothelial cells, and in its soluble form (sNG2) it promotes angiogenesis. Earlier studies have shown that the amount of NG2 shed from pericytes is altered when they are stimulated with Aβ1-42. Purpose: To investigate whether the Aβ1-42 influence on NG2 shedding is mediated via LRP-1. Method: Human brain vascular pericytes (HBVP) were stimulated with monomeric, oligomeric and fibrillar preparations of Aβ1-42. Expression of LRP-1 was knocked down by small interfering RNA (siRNA) silencing, and knockdown efficiency was analysed with Western blot (WB). Aβ1-42 preparations were also analysed with WB. Cell viability was measured with a lactate dehydrogenase (LDH) test, and protein concentrations were determined with a Bradford assay. Finally, the amount of sNG2 in pericyte medium was measured with an enzyme-linked immunosorbent assay (ELISA) based on electrochemiluminescence (Mesoscale). Results: Monomeric and oligomeric Aβ1-42 increased NG2 shedding; this Aβ1-42-induced increase was not found in HBVP with silenced LRP-1. In contrast, fibrillar Aβ1-42 inhibited NG2 shedding regardless of LRP-1 presence. Monomeric Aβ1-42 preparations induced cell death in HBVP with LRP-1 but not in HBVP without LRP-1, and cell viability of HBVP lacking LRP-1 was increased after fibrillar Aβ1-42 exposure. 
Conclusion: The results indicate that monomeric and oligomeric Aβ1-42 affect NG2 shedding via LRP-1. In contrast, fibrillar Aβ1-42 appears not to affect NG2 shedding via LRP-1, but rather to inhibit the process via another, unknown pathway. The study highlights how Aβ1-42 can affect pericytes, which may underlie the vascular changes linked to AD pathology.
|
288 |
Differences in Lipid Profiles and Atherogenic Indices Between Hypertensive and Normotensive Populations: A Cross-Sectional Study of 11 Chinese Cities
Cheng, Wenke, Wang, Lili, Chen, Siwei 03 July 2023 (has links)
Background: Several previous studies have reported that dyslipidemia is associated with the risk of hypertension, but these studies have mainly been conducted in European and US populations, with very few in Asian populations. Moreover, the effects of atherogenic indices, including the atherogenic coefficient (AC) and the atherogenic index of plasma (AIP), on hypertension in Asians have not been well described so far.

Methods: From 2010 to 2016, altogether 211,833 Chinese adults were recruited at health centers in 11 Chinese cities (Shanghai, Beijing, Nanjing, Suzhou, Shenzhen, Changzhou, Chengdu, Guangzhou, Hefei, Wuhan, and Nantong). Differences in continuous variables between the two groups were analyzed by the Mann–Whitney test, and differences in categorical variables by the chi-squared test. Logistic regression was applied to evaluate the association between lipid profiles and the risk of hypertension. The predictive values of AC and AIP for the incidence of hypertension were analyzed using the area under the receiver operating characteristic (ROC) curve. Bayesian network (BN) models were constructed to further analyze the associations between the covariates and the incidence of hypertension.

Results: A total of 117,056 participants were included in the final analysis. There were significant differences in baseline characteristics between the normotension and hypertension groups (p < 0.001). In multivariate logistic regression, the risk of hypertension increased by 0.2% (1.002 [1.001–1.003]) per 1 mg/dl increase in each of total cholesterol (TC), low-density lipoprotein (LDL), and non-high-density lipoprotein cholesterol (non-HDL-c). However, after adjusting for body mass index (BMI), an increase in HDL-c level was associated with a higher risk of hypertension (p for trend < 0.001), with the risk increasing by 0.6% per 1 mg/dl increase in HDL-c (1.006 [1.003–1.008]). In women, AC had the highest predictive value for the incidence of hypertension, with an area under the curve (AUC) of 0.667 (95% confidence interval (CI): 0.659–0.674). BN models suggested that TC and LDL were more closely related to the incidence of hypertension.

Conclusions: Overall, lipid profiles were significantly more abnormal in the hypertensive population than in the normotensive population. TC and LDL were strongly associated with the incidence of hypertension. TC, LDL, and non-HDL-c levels showed a positive association and HDL-c a negative association, while TG was not significantly associated with the risk of hypertension. After adjusting for BMI, HDL-c turned out to be positively associated with the risk of hypertension. In addition, AC has good predictive value for the incidence of hypertension in women.
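As an illustrative sketch of how the per-unit odds ratios reported above compound over larger differences in a lipid measure (the function name and the 30 mg/dl difference are ours, not from the study):

```python
import math

def compounded_odds_ratio(or_per_unit, units):
    """Odds ratios from logistic regression multiply on the odds scale:
    a per-unit OR of r implies r**k over a k-unit increase in the predictor
    (equivalently, log-odds accumulate linearly)."""
    return or_per_unit ** units

# Per the abstract, hypertension odds rise ~0.2% per 1 mg/dl of LDL
# (OR = 1.002). Over a hypothetical 30 mg/dl difference this compounds to
# roughly a 6% increase in the odds:
print(round(compounded_odds_ratio(1.002, 30), 3))
```

This multiplicative reading is why an apparently tiny per-mg/dl coefficient can still correspond to a meaningful risk difference across the population's lipid range.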
|
289 |
The role of the low-density lipoprotein receptor in transport and metabolism of LDL through the wall of normal rabbit aorta in vivo. Estimation of model parameters from optimally designed dual-tracer experiments
Morris, Evan Daniel January 1991 (has links)
No description available.
|
290 |
Meso-Scale Wetting of Paper Towels
Abedsoltan, Hossein 31 July 2017 (has links)
No description available.
|