  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

Assessment of the effect of place of selection on performance of health posts and turnover of health extension workers in Jimma Zone, Ethiopia

Aman, Hagos Amir January 2012 (has links)
Magister Public Health - MPH / The Health Extension Program (HEP) was initiated under the Health Sector Development Program (HSDP II) in 2002/03. The central philosophy of these initiatives was the belief that, if the right knowledge and skills are transferred, each household can take responsibility for producing and maintaining its own health. The HEP is delivered through Health Extension Workers (HEWs), local women who have completed grade 10. Recruitment of these workers is conducted by kebele (village) and woreda (district) councils. They are then provided with one year of training prior to being employed by the district health office. The HEP guideline states that all HEWs should be assigned to a health post within their own community. The rationale for this requirement is that health policy makers and managers believe that the deployment of non-local HEWs results in poor performance and turnover. However, there is no evidence to support this assumption. This study was conducted to better understand the effect of place of selection on the performance of health posts and the turnover of Health Extension Workers in Jimma Zone, Ethiopia. A cross-sectional analytical study design was used in a randomly selected sample of six districts of Jimma Zone. A systematic record review of the activity reports for the Ethiopian Fiscal Year (EFY) 2003 was conducted on all selected health posts (239 randomly selected from all functional rural and urban health posts in Jimma Zone), obtained from the district health offices. Descriptive statistics were computed to describe the socio-demographic characteristics and the level of performance. 
A chi-square test was performed to test the relationships among the variables. The findings showed that HEWs assigned outside of their communities performed as well as, or even better than, those recruited from the same communities. The relationship with staff turnover and retention could not be estimated due to the limited availability of information on this factor. Overall, despite the widely held opinion among policy makers that recruiting HEWs from the same community enhances their performance, this study found little empirical evidence to support that argument. It is therefore essential to explore additional factors and criteria in the selection and recruitment process, beyond residence-based measures, in the expectation of enhancing the performance of HEWs.
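The chi-square analysis described above tests whether place of selection and health-post performance are independent. The sketch below computes the Pearson statistic for a hypothetical 2×2 table; the counts are invented purely for illustration and are not the study's data.

```python
# Illustrative 2x2 chi-square test of independence: place of selection
# (local vs. non-local HEW) against health-post performance (high vs. low).
# The counts below are made up for illustration; they are NOT from the study.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat

table = [[40, 20],   # local HEWs: high- / low-performing posts
         [35, 15]]   # non-local HEWs: high- / low-performing posts
stat = chi_square_2x2(table)
# Compare against 3.841, the chi-square critical value for 1 df at alpha = 0.05;
# a statistic below it means no significant association is detected.
print(round(stat, 3))
```

With these invented counts the statistic is far below the critical value, mirroring the study's finding of no performance advantage for locally recruited HEWs.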
62

Adaptação de linhagens celulares humanas para crescimento em suspensão e meios de cultura livres de soro fetal bovino / Serum-free suspension adaptation of human cell lines

Biaggio, Rafael Tagé 28 March 2014 (has links)
Linhagens celulares humanas têm atraído grande interesse devido a sua capacidade de glicosilar proteínas de maneira mais semelhante às proteínas nativas humanas, reduzindo o potencial de respostas imunológicas contra epítopos não humanos. No entanto, por se tratar de uma aplicação recente, essas células ainda não foram extensamente caracterizadas e cultivadas em condições reprodutíveis da escala industrial, ou seja, em suspensão e em meios de cultura livres de soro fetal bovino (SFB). Em função disso, o objetivo principal deste trabalho foi estabelecer culturas livres de SFB e em suspensão para as linhagens celulares humanas SK-Hep-1, HepG2 e HKB-11, que têm despertado grande interesse devido ao potencial de produção de proteínas recombinantes. Para isso, quatro formulações comerciais livres de SFB foram avaliadas. As células que apresentaram bons resultados na adaptação aos meios realizada em garrafas estáticas foram então adaptadas para crescimento em suspensão. Foi possível realizar a adaptação satisfatória da célula HKB-11 ao meio FreeStyle e da célula SK-Hep-1 ao meio SFMII, bem como a criopreservação das mesmas também em condições livres de SFB. A caracterização cinética das células adaptadas mostrou que a célula HKB-11 apresentou concentração celular quatro vezes superior à da célula SK-Hep-1 (8,6×10⁶ e 1,9×10⁶ células/mL, respectivamente) e apresentou crescimento celular durante 18 dias em cultura. A velocidade específica de crescimento máxima (µmax) foi semelhante nas duas células (0,0159 h⁻¹ para a HKB-11 e 0,0186 h⁻¹ para a SK-Hep-1). A limitação do crescimento das células adaptadas não parece estar associada à exaustão de glicose e glutamina, tampouco à formação de lactato em concentrações inibitórias. Todavia, em ambos os casos, foi observada produção de amônia em concentrações consideradas inibitórias (2 - 5 mM). 
De maneira geral, foi possível estabelecer culturas celulares em condições compatíveis com o desenvolvimento de um bioprocesso reprodutível, seguro e em concordância com as boas práticas de fabricação. / Human cell lines have attracted great interest since they are capable of glycosylating proteins in a manner more similar to native human proteins, reducing the potential for immune responses against non-human epitopes. However, these human cell lines have not been extensively characterized and cultured at large scale in serum-free suspension conditions. The main objective of this work was therefore to adapt three human cell lines, SK-Hep-1, HepG2 and HKB-11, to serum-free suspension cultures, since they are promising systems for recombinant protein expression. Four commercial serum-free media were tested. Cell lines adapted in T-flasks were further adapted to suspension cultures. Results showed that HKB-11 and SK-Hep-1 were adapted to serum-free suspension cultures in FreeStyle and SFMII media, respectively, and were cryopreserved in serum-free formulations. Kinetic characterization showed that the HKB-11 cell concentration was four times higher than that of SK-Hep-1 (8.6×10⁶ and 1.9×10⁶ cells/mL, respectively), with cell growth sustained over 18 days in culture. The maximum specific growth rate (µmax) was similar for both cell lines (0.0159 h⁻¹ for HKB-11 and 0.0186 h⁻¹ for SK-Hep-1). Growth limitation of the adapted cell lines does not seem to be associated with depletion of glucose and glutamine, nor with the formation of lactate at inhibitory concentrations. However, in both cases, ammonia production reached inhibitory concentrations (2 - 5 mM). In general, it was possible to establish human cell cultures compatible with reproducible and safe bioprocess conditions, in compliance with good manufacturing practices.
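As a quick sanity check on the kinetic figures quoted above, the maximum specific growth rate µmax converts to a population doubling time via td = ln 2 / µmax. A minimal sketch using the reported rates (the resulting doubling times are derived here, not stated in the abstract):

```python
import math

def doubling_time(mu_max_per_h):
    """Doubling time (h) for exponential growth at specific rate mu_max (h^-1)."""
    return math.log(2) / mu_max_per_h

# Maximum specific growth rates reported for the adapted cell lines
print(round(doubling_time(0.0159), 1))  # HKB-11   -> 43.6 h
print(round(doubling_time(0.0186), 1))  # SK-Hep-1 -> 37.3 h
```

The similar doubling times (roughly 37-44 h) are consistent with the abstract's statement that µmax was comparable for the two lines.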
64

Search for Higgs boson decays to beyond-the-Standard-Model light bosons in four-lepton events with the ATLAS detector at the LHC

Chiu, Justin 22 December 2020 (has links)
This thesis presents a search for the dark-sector process h -> Zd Zd -> 4l in events collected by the ATLAS detector at the Large Hadron Collider in 2015-2018. In this theorized process, the Standard Model Higgs boson (h) decays to four leptons via two intermediate beyond-the-Standard-Model particles, each called Zd. The process arises from interactions of the Standard Model with a dark sector: one or more new particles that have limited or zero interaction with the Standard Model, such as the new vector boson Zd (dark photon). A dark sector could have a phenomenology as rich as that of the visible sector (the Standard Model) and could naturally address many outstanding problems in particle physics; for example, it could contain a particle candidate for dark matter. Higgs decays to beyond-the-Standard-Model particles in particular are well motivated theoretically and are not tightly constrained: current measurements of Standard Model Higgs properties permit the fraction of such decays to be as high as approximately 30%. The results of this search show no evidence for the h -> Zd Zd -> 4l process and are therefore interpreted as upper limits on the branching ratio B(h -> Zd Zd) and the effective Higgs mixing parameter kappa^prime. / Graduate
65

Purificação e caracterização de uma β-N-acetilhexosaminidase extraída do mamífero marinho Sotalia fluviatilis

Gomes Júnior, José Edilson 06 December 2006 (has links)
This work reports the 2232-fold purification of a β-N-acetylhexosaminidase from hepatic extracts of the marine mammal Sotalia fluviatilis, with a final recovery of 8.4%. Sequential steps were used for enzyme purification: ammonium sulfate fractionation followed by Biogel A 1.5 m, chitin, DEAE-Sepharose and hydroxyapatite chromatographies. The protein molecular mass was estimated at 10 kDa by SDS-PAGE and confirmed by MALDI-TOF. The enzyme was found to have an optimum pH of 5.0 and an optimum temperature of 60 °C. Using p-nitrophenyl-N-acetyl-β-D-glucosaminide as substrate, apparent Km and Vmax values were 2.72 mM and 0.572 nmol/mg/min, respectively. The enzyme was inhibited by mercury chloride (HgCl2) and sodium dodecyl sulfate (SDS). / Este trabalho mostra a purificação de 2232 vezes de uma β-N-acetilhexosaminidase obtida a partir dos extratos hepáticos do mamífero marinho Sotalia fluviatilis, com recuperação final de 8,4%. Passos sequenciais foram utilizados para a purificação enzimática: fracionamento com sulfato de amônio e as cromatografias de Biogel A 1.5 m, Quitina, DEAE-Sepharose e Hidroxiapatita. A massa molecular proteica foi estimada em 10 kDa usando SDS-PAGE e confirmada por MALDI-TOF. Foram encontrados como pH e temperatura ótimos 5,0 e 60 °C, respectivamente. Os valores de Km e Vmáx aparentes foram 2,72 mM e 0,572 nmol/mg/min, sendo utilizado o p-nitrofenil-N-acetil-β-D-glicosaminídeo como substrato. A enzima foi inibida pelo cloreto de mercúrio (HgCl2) e dodecil sulfato de sódio (SDS).
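The apparent Km and Vmax reported above determine the enzyme's velocity curve through the Michaelis-Menten equation v = Vmax·[S]/(Km + [S]). A minimal sketch using the reported constants; the half-Vmax check at [S] = Km follows from the equation itself:

```python
def michaelis_menten(s_mM, vmax=0.572, km=2.72):
    """Reaction velocity (nmol/mg/min) at substrate concentration s_mM (mM),
    using the apparent Km and Vmax reported for the purified enzyme."""
    return vmax * s_mM / (km + s_mM)

# At [S] = Km the velocity is exactly half of Vmax, by definition:
print(round(michaelis_menten(2.72), 3))  # -> 0.286
```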
66

Unravelling The Regulators Of Translation And Replication Of Hepatitis C Virus

Ray, Upasana January 2011 (has links) (PDF)
Hepatitis C virus (HCV) is a positive-sense, single-stranded RNA virus belonging to the genus Hepacivirus and the family Flaviviridae. It predominantly infects human liver cells. Although treatment with α-interferon and ribavirin can control HCV in some cases, it fails to achieve a sustained virological response in others, emphasizing the need for novel therapeutic targets. The viral genome is 9.6 kb long, consisting of a 5' untranslated region (5'UTR), a long open reading frame (ORF) that encodes the viral proteins, and the 3' untranslated region (3'UTR). The 5'UTR contains a cis-acting element, the internal ribosome entry site (IRES), that mediates internal initiation of translation. The HCV 5'UTR is highly structured and consists of four major stem-loops (SL) and a pseudoknot structure. HCV proteins are synthesized by IRES-mediated translation of the viral RNA, the initial obligatory step after infection. The viral proteins are synthesized as one long continuous chain, the polyprotein, which is then processed by host-cell and viral proteases. Once sufficient viral proteins have been synthesized, the viral RNA is replicated. However, the mechanism of the switch from translation to viral RNA replication is not well understood. Several host proteins as well as viral proteins help complete various steps in the HCV life cycle. In this thesis, the role of two such factors in HCV RNA translation and replication has been characterized and exploited to develop anti-HCV peptides. HCV proteins fall broadly into two major functional classes: the non-structural and the structural proteins. The NS3 protein (one of the viral non-structural proteins) plays a central role in viral polyprotein processing and RNA replication. 
In the first part of the thesis, it has been demonstrated that the NS3 protease (NS3pro) domain alone can specifically bind HCV-IRES RNA, predominantly in the SLIV region. The cleavage activity of the NS3 protease domain is reduced upon HCV-RNA binding, owing to the participation of the catalytic-triad residue Ser 139 in this RNA–protein interaction. More importantly, NS3pro binding to the SLIV region hinders the interaction of La protein, a cellular IRES trans-acting factor required for HCV IRES-mediated translation, resulting in inhibition of HCV-IRES activity. Moreover, excess La protein could rescue the inhibition caused by the NS3 protease. Additionally, it was observed that the NS3 protease and human La protein could out-compete each other for binding to the HCV SLIV region, indicating that the two proteins share a binding region near the initiator AUG; this was further confirmed using an RNase T1 footprinting assay. Although overexpression of NS3pro as well as the full-length NS3 protein decreased the level of HCV IRES-mediated translation in cells, replication of HCV RNA was enhanced significantly. These observations suggested that NS3pro binding to the HCV IRES reduces translation in favour of RNA replication. The competition between the host factor (La) and the viral protein (NS3) for binding to the HCV IRES might contribute to the regulation of the molecular switch from translation to replication of HCV. In the second part, the interaction of the NS3 protease with the HCV IRES has been elucidated in detail, and the insights obtained were used to target HCV RNA function. A computational approach was used to predict the putative amino acid residues within the protease that might be involved in the interaction with the HCV IRES. Based on these predictions, a 30-mer peptide (NS3proC-30) was designed from the RNA-binding region. This peptide retained the RNA-binding ability and also inhibited IRES-mediated translation. 
The NS3proC-30 peptide was further shortened to 15-mer length (NS3proC-C15) and shown ex vivo to inhibit translation as well as replication. Additionally, its activity was tested in vivo in a mouse model by encapsulating the peptide in a Sendai-virus-based virosome, followed by preferential delivery to the liver. This virosome, derived from the Sendai virus F protein, has a terminal galactose moiety that interacts with the asialoglycoprotein receptor on hepatocytes, leading to membrane fusion and release of the contents inside the cell. The results suggested that this peptide can be used as a potent anti-HCV agent. It has been shown earlier by our laboratory that La protein interacts with the HCV IRES near the initiator AUG, at the GCAC motif, through its central RNA recognition motif, RRM2 (residues 112-184). A 24-mer peptide derived from this RRM2 of La (LaR2C) retained RNA-binding ability and inhibited HCV RNA translation. NMR spectroscopy of the HCV-IRES-bound peptide complex revealed putative contact points, mutations at which reduced RNA binding and translation-inhibitory activity. The residues responsible for RNA recognition were found to form a turn in the RRM2 structure. A 7-mer peptide (LaR2C-N7) comprising this turn showed significant translation-inhibitory activity. The bound structure of the peptide, inferred from transferred NOE (Nuclear Overhauser Effect) experiments, suggested it to be a β-turn. Interestingly, addition of a hexa-arginine tag enabled the peptide to enter Huh7 cells, where it inhibited HCV-IRES function. More importantly, the peptide significantly inhibited replication of HCV RNA. Smaller forms of this peptide, however, failed to show significant inhibition of HCV RNA functions, suggesting that the 7-mer peptide is the smallest efficient anti-HCV peptide derivable from the second RNA recognition motif of the human La protein. Further, combinations of the LaR2C-N7 and NS3proC-C15 peptides showed better inhibitory activity. 
Both peptides were found to interact with similar regions of SLIV around the initiator AUG. The two approaches have the potential to block HCV RNA-directed translation by targeting a host factor and a viral protein, and can thus be tried in combination as a multi-drug approach to combat HCV infection. Taken together, the study reveals important insights into the complex regulation of HCV RNA translation and replication by the host protein La and the viral NS3 protein. The interaction of the NS3 protein with SLIV of the HCV IRES dislodges the human La protein, inhibiting translation in favour of RNA replication. These two proteins thus act as regulators of the translation and replication of the viral RNA, and the peptides derived from them in turn regulate the functions of these proteins and inhibit HCV RNA functions.
67

Application of machine learning for energy reconstruction in the ATLAS liquid argon calorimeter

Polson, Lucas A. 06 July 2021 (has links)
The beam intensity of the Large Hadron Collider will be significantly increased during the Phase-II long shutdown of 2024-2026. The signal processing techniques used to extract the energy of detected particles in ATLAS will suffer a significant loss in performance under these conditions. This study compares the presently used optimal filter technique to alternative machine learning algorithms for signal processing. The machine learning algorithms are shown to outperform the optimal filter on many metrics relevant to energy extraction. This thesis also explores the implementation of machine learning algorithms on ATLAS hardware. / Graduate
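For context, the optimal filter baseline mentioned above estimates a pulse's amplitude as a weighted linear combination of its digitized samples, with weights normally derived from the known pulse shape and the noise autocorrelation. The sketch below is generic; the coefficients and samples are invented for illustration and are not the ATLAS calibration:

```python
def optimal_filter_amplitude(samples, weights):
    """Estimate pulse amplitude as a linear combination of ADC samples.
    In a real calorimeter the weights come from the pulse shape and noise
    autocorrelation matrix; here they are placeholders for illustration."""
    return sum(w * s for w, s in zip(weights, samples))

weights = [-0.10, 0.25, 0.70, 0.25, -0.10]  # illustrative filter coefficients
samples = [2.0, 30.0, 100.0, 60.0, 10.0]    # pedestal-subtracted ADC samples
print(round(optimal_filter_amplitude(samples, weights), 2))  # -> 91.3
```

Under high pile-up, overlapping pulses distort this linear estimate, which is the degradation that motivates the machine-learning alternatives studied in the thesis.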
68

Laser Spectroscopy Studying Organic and Inorganic Intermediates in The Atmospheric Oxidation Process

Chen, Ming-Wei 20 October 2011 (has links)
No description available.
69

Distributed Computing Solutions for High Energy Physics Interactive Data Analysis

Padulano, Vincenzo Eduardo 04 May 2023 (has links)
[ES] La investigación científica en Física de Altas Energías (HEP) se caracteriza por desafíos computacionales complejos, que durante décadas tuvieron que ser abordados mediante la investigación de técnicas informáticas en paralelo a los avances en la comprensión de la física. Uno de los principales actores en el campo, el CERN, alberga tanto el Gran Colisionador de Hadrones (LHC) como miles de investigadores cada año que se dedican a recopilar y procesar las enormes cantidades de datos generados por el acelerador de partículas. Históricamente, esto ha proporcionado un terreno fértil para las técnicas de computación distribuida, conduciendo a la creación de Worldwide LHC Computing Grid (WLCG), una red global de gran potencia informática para todos los experimentos LHC y del campo HEP. Los datos generados por el LHC hasta ahora ya han planteado desafíos para la informática y el almacenamiento. Esto solo aumentará con futuras actualizaciones de hardware del acelerador, un escenario que requerirá grandes cantidades de recursos coordinados para ejecutar los análisis HEP. La estrategia principal para cálculos tan complejos es, hasta el día de hoy, enviar solicitudes a sistemas de colas por lotes conectados a la red. Esto tiene dos grandes desventajas para el usuario: falta de interactividad y tiempos de espera desconocidos. En años más recientes, otros campos de la investigación y la industria han desarrollado nuevas técnicas para abordar la tarea de analizar las cantidades cada vez mayores de datos generados por humanos (una tendencia comúnmente mencionada como "Big Data"). Por lo tanto, han surgido nuevas interfaces y modelos de programación que muestran la interactividad como una característica clave y permiten el uso de grandes recursos informáticos. 
A la luz del escenario descrito anteriormente, esta tesis tiene como objetivo aprovechar las herramientas y arquitecturas de la industria de vanguardia para acelerar los flujos de trabajo de análisis en HEP, y proporcionar una interfaz de programación que permite la paralelización automática, tanto en una sola máquina como en un conjunto de recursos distribuidos. Se centra en los modelos de programación modernos y en cómo hacer el mejor uso de los recursos de hardware disponibles al tiempo que proporciona una experiencia de usuario perfecta. La tesis también propone una solución informática distribuida moderna para el análisis de datos HEP, haciendo uso del software llamado ROOT y, en particular, de su capa de análisis de datos llamada RDataFrame. Se exploran algunas áreas clave de investigación en torno a esta propuesta. Desde el punto de vista del usuario, esto se detalla en forma de una nueva interfaz que puede ejecutarse en una computadora portátil o en miles de nodos informáticos, sin cambios en la aplicación del usuario. Este desarrollo abre la puerta a la explotación de recursos distribuidos a través de motores de ejecución estándar de la industria que pueden escalar a múltiples nodos en clústeres HPC o HTC, o incluso en ofertas serverless de nubes comerciales. Dado que el análisis de datos en este campo a menudo está limitado por E/S, se necesita comprender cuáles son los posibles mecanismos de almacenamiento en caché. En este sentido, se investigó un sistema de almacenamiento novedoso basado en la tecnología de almacenamiento de objetos como objetivo para el caché. En conclusión, el futuro del análisis de datos en HEP presenta desafíos desde varias perspectivas, desde la explotación de recursos informáticos y de almacenamiento distribuidos hasta el diseño de interfaces de usuario ergonómicas. 
Los marcos de software deben apuntar a la eficiencia y la facilidad de uso, desvinculando la definición de los cálculos físicos de los detalles de implementación de su ejecución. Esta tesis se enmarca en el esfuerzo colectivo de la comunidad HEP hacia estos objetivos, definiendo problemas y posibles soluciones que pueden ser adoptadas por futuros investigadores. / [CA] La investigació científica a Física d'Altes Energies (HEP) es caracteritza per desafiaments computacionals complexos, que durant dècades van haver de ser abordats mitjançant la investigació de tècniques informàtiques en paral·lel als avenços en la comprensió de la física. Un dels principals actors al camp, el CERN, acull tant el Gran Col·lisionador d'Hadrons (LHC) com milers d'investigadors cada any que es dediquen a recopilar i processar les enormes quantitats de dades generades per l'accelerador de partícules. Històricament, això ha proporcionat un terreny fèrtil per a les tècniques de computació distribuïda, conduint a la creació del Worldwide LHC Computing Grid (WLCG), una xarxa global de gran potència informàtica per a tots els experiments LHC i del camp HEP. Les dades generades per l'LHC fins ara ja han plantejat desafiaments per a la informàtica i l'emmagatzematge. Això només augmentarà amb futures actualitzacions de maquinari de l'accelerador, un escenari que requerirà grans quantitats de recursos coordinats per executar les anàlisis HEP. L'estratègia principal per a càlculs tan complexos és, fins avui, enviar sol·licituds a sistemes de cues per lots connectats a la xarxa. Això té dos grans desavantatges per a l'usuari: manca d'interactivitat i temps de espera desconeguts. En anys més recents, altres camps de la recerca i la indústria han desenvolupat noves tècniques per abordar la tasca d'analitzar les quantitats cada vegada més grans de dades generades per humans (una tendència comunament esmentada com a "Big Data"). 
Per tant, han sorgit noves interfícies i models de programació que mostren la interactivitat com a característica clau i permeten l'ús de grans recursos informàtics. A la llum de l'escenari descrit anteriorment, aquesta tesi té com a objectiu aprofitar les eines i les arquitectures de la indústria d'avantguarda per accelerar els fluxos de treball d'anàlisi a HEP, i proporcionar una interfície de programació que permet la paral·lelització automàtica, tant en una sola màquina com en un conjunt de recursos distribuïts. Se centra en els models de programació moderns i com fer el millor ús dels recursos de maquinari disponibles alhora que proporciona una experiència d'usuari perfecta. La tesi també proposa una solució informàtica distribuïda moderna per a l'anàlisi de dades HEP, fent ús del programari anomenat ROOT i, en particular, de la seva capa d'anàlisi de dades anomenada RDataFrame. S'exploren algunes àrees clau de recerca sobre aquesta proposta. Des del punt de vista de l'usuari, això es detalla en forma duna nova interfície que es pot executar en un ordinador portàtil o en milers de nodes informàtics, sense canvis en l'aplicació de l'usuari. Aquest desenvolupament obre la porta a l'explotació de recursos distribuïts a través de motors d'execució estàndard de la indústria que poden escalar a múltiples nodes en clústers HPC o HTC, o fins i tot en ofertes serverless de núvols comercials. Atès que sovint l'anàlisi de dades en aquest camp està limitada per E/S, cal comprendre quins són els possibles mecanismes d'emmagatzematge en memòria cau. En aquest sentit, es va investigar un nou sistema d'emmagatzematge basat en la tecnologia d'emmagatzematge d'objectes com a objectiu per a la memòria cau. En conclusió, el futur de l'anàlisi de dades a HEP presenta reptes des de diverses perspectives, des de l'explotació de recursos informàtics i d'emmagatzematge distribuïts fins al disseny d'interfícies d'usuari ergonòmiques. 
Els marcs de programari han d'apuntar a l'eficiència i la facilitat d'ús, desvinculant la definició dels càlculs físics dels detalls d'implementació de la seva execució. Aquesta tesi s'emmarca en l'esforç col·lectiu de la comunitat HEP cap a aquests objectius, definint problemes i possibles solucions que poden ser adoptades per futurs investigadors. / [EN] The scientific research in High Energy Physics (HEP) is characterised by complex computational challenges, which over the decades had to be addressed by researching computing techniques in parallel to the advances in understanding physics. One of the main actors in the field, CERN, hosts both the Large Hadron Collider (LHC) and thousands of researchers yearly who are devoted to collecting and processing the huge amounts of data generated by the particle accelerator. This has historically provided a fertile ground for distributed computing techniques, which led to the creation of the Worldwide LHC Computing Grid (WLCG), a global network providing large computing power for all the experiments revolving around the LHC and the HEP field. Data generated by the LHC so far has already posed challenges for computing and storage. This is only going to increase with future hardware updates of the accelerator, which will bring a scenario that will require large amounts of coordinated resources to run the workflows of HEP analyses. The main strategy for such complex computations is, still to this day, submitting applications to batch queueing systems connected to the grid and wait for the final result to arrive. This has two great disadvantages from the user's perspective: no interactivity and unknown waiting times. In more recent years, other fields of research and industry have developed new techniques to address the task of analysing the ever increasing large amounts of human-generated data (a trend commonly mentioned as "Big Data"). 
Thus, new programming interfaces and models have arisen that most often showcase interactivity as a key feature while also allowing the usage of large computational resources. In light of the scenario described above, this thesis aims at leveraging cutting-edge industry tools and architectures to speed up analysis workflows in High Energy Physics, while providing a programming interface that enables automatic parallelisation, both on a single machine and on a set of distributed resources. It focuses on modern programming models and on how to make best use of the available hardware resources while providing a seamless user experience. The thesis also proposes a modern distributed computing solution for HEP data analysis, making use of the established software framework ROOT and in particular of its data analysis layer implemented with the RDataFrame class. A few key research areas revolving around this proposal are explored. From the user's point of view, this takes the form of a new interface to data analysis that is able to run on a laptop or on thousands of computing nodes, with no change in the user application. This development opens the door to exploiting distributed resources via industry-standard execution engines that can scale to multiple nodes on HPC or HTC clusters, or even on serverless offerings of commercial clouds. Since data analysis in this field is often I/O bound, a good understanding of the possible caching mechanisms is needed. In this regard, a novel storage system based on object store technology was researched as a target for caching. In conclusion, the future of data analysis in High Energy Physics presents challenges from various perspectives, from the exploitation of distributed computing and storage resources to the design of ergonomic user interfaces. 
Software frameworks should aim at efficiency and ease of use, decoupling as much as possible the definition of the physics computations from the implementation details of their execution. This thesis is framed in the collective effort of the HEP community towards these goals, defining problems and possible solutions that can be adopted by future researchers. / Padulano, VE. (2023). Distributed Computing Solutions for High Energy Physics Interactive Data Analysis [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/193104
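The distributed execution model described in this abstract is essentially map-reduce: split the dataset into entry ranges, run the same analysis on each range independently, and merge the partial results. A toy sketch of that pattern in plain Python (not the actual ROOT/RDataFrame API):

```python
def split_ranges(n_entries, n_chunks):
    """Split [0, n_entries) into roughly equal half-open entry ranges."""
    base, extra = divmod(n_entries, n_chunks)
    ranges, start = [], 0
    for i in range(n_chunks):
        end = start + base + (1 if i < extra else 0)
        ranges.append((start, end))
        start = end
    return ranges

def mapper(entry_range, data):
    """Run the analysis on one range: here, a simple filtered count."""
    start, end = entry_range
    return sum(1 for x in data[start:end] if x > 0.5)

def reducer(partials):
    """Merge the partial results produced by all ranges."""
    return sum(partials)

data = [0.1, 0.9, 0.6, 0.4, 0.8, 0.2, 0.7, 0.3]
ranges = split_ranges(len(data), 3)
print(reducer(mapper(r, data) for r in ranges))  # -> 4, same as a single pass
```

In the real system each mapper would run on a separate node of an HPC/HTC cluster and the reducer would merge histograms or other aggregates, but the control flow is the same.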
70

Einfluss von freien Fettsäuren und Triglyceriden auf die Expression von proinflammatorischen Mediatoren und Adhäsionsmolekülen in Hepatozyten und Kupffer-Zellen (der Ratte) / Effect of free fatty acids and triglycerides on the expression of proinflammatory mediators and adhesion molecules in hepatocytes and Kupffer cells (of the rat)

Demuth, Julia Elisabeth 01 December 2009 (has links)
No description available.
