1 |
Genome sequencing of Leptolyngbya Heron Island, 2Å crystal structure of phycoerythrin and spectroscopic investigation of chromatic acclimation. January 2014 (has links)
abstract: Photosynthesis is the primary source of energy for most living organisms. Light-harvesting complexes (LHCs) play a vital role in capturing sunlight and passing it on to the protein complexes of the electron transfer chain, which create the electrochemical potential across the membrane that drives ATP synthesis. Phycobilisomes (PBS) are the most important LHCs in cyanobacteria. The PBS is a complex of three light-harvesting proteins: phycoerythrin (PE), phycocyanin (PC) and allophycocyanin (APC). This work was carried out on a newly discovered cyanobacterium called Leptolyngbya Heron Island (L.HI). The study has three main goals: 1) Sequencing, assembly and annotation of the L.HI genome. Since this is a newly discovered cyanobacterium, its genome had not previously been elucidated. Illumina sequencing, a next-generation sequencing (NGS) technology, was employed to sequence the genome. Unfortunately, the natural isolate contained other contaminating, and potentially symbiotic, bacterial populations. A novel bioinformatics strategy for separating the DNA of contaminating bacterial populations from that of L.HI was devised, combining tetranucleotide frequency, %(G+C), BLAST analysis and gene annotation. 2) Structural elucidation of phycoerythrin. Phycoerythrin is the most important protein in the PBS assembly because it is one of the few light-harvesting proteins that absorbs green light. The protein was crystallized and its structure solved to a resolution of 2Å. It contains two chemically distinct types of chromophore: phycourobilin and phycoerythrobilin. Energy transfer calculations indicate a unidirectional flow of energy from phycourobilin to phycoerythrobilin, and energy transfer time constants computed using Förster energy transfer theory are consistent with experimental data available in the literature.
3) Effect of chromatic acclimation on photosystems. Chromatic acclimation is a phenomenon in which an organism modulates its PE/PC ratio in response to changing light conditions. Our investigation of L.HI revealed that PE is expressed more strongly in green light than PC is in red light, leading to unequal light harvesting in the two states. Photosystem II expression is therefore increased in red-light-acclimated cells, coupled with an increase in the number of PBS. / Dissertation/Thesis / Ph.D. Chemistry 2014
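The contaminant-binning step pairs two inexpensive per-contig signals, tetranucleotide frequency and %(G+C), before BLAST and annotation. The dissertation's actual pipeline is not reproduced in the abstract; the sketch below only illustrates how those two signals might be computed per contig (function names and normalization are assumptions, not the thesis's code):

```python
from collections import Counter
from itertools import product

def gc_content(seq):
    """Fraction of G and C bases in a contig."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def tetranucleotide_freq(seq):
    """Normalized frequency vector over all 256 possible tetranucleotides,
    computed from a sliding 4-mer window."""
    seq = seq.upper()
    counts = Counter(seq[i:i + 4] for i in range(len(seq) - 3))
    total = sum(counts.values())
    return {"".join(t): counts["".join(t)] / total
            for t in product("ACGT", repeat=4)}
```

Contigs whose (frequency vector, GC) pairs cluster away from the L.HI signature would then be candidates for the contaminating populations.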
|
2 |
Utilização de ferramentas tecnológicas para análise musical: a ladainha de Nossa Senhora de Faustino Xavier do Prado na visão de um descritor / Use of technological tools for musical analysis: the Ladainha de Nossa Senhora by Faustino Xavier do Prado as seen by a descriptor. Dias, Robson, 27 November 2015 (has links)
This work studies the features of a descriptor and its applicability to musical analysis, taking as its starting point the Ladainha de Nossa Senhora by Faustino Xavier do Prado. The melodic intervals found in the Ladainha are used as patterns to check for similarities across three contexts: Brazil, Italy and Portugal. The MelodicMatch software is used as the pattern-search tool, and a further program is developed to process the results, organizing them into a methodological proposal described within the application itself.
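The pattern search itself is delegated to MelodicMatch, whose internals are not described in the abstract. As a rough illustration of what interval-based matching involves (the MIDI-pitch representation and function names below are assumptions, not MelodicMatch's API):

```python
def to_intervals(pitches):
    """Represent a melody as successive semitone intervals (MIDI numbers)."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def find_pattern(melody_pitches, pattern_intervals):
    """Return the start indices where the interval pattern occurs
    in the melody, regardless of transposition."""
    iv = to_intervals(melody_pitches)
    n = len(pattern_intervals)
    return [i for i in range(len(iv) - n + 1)
            if iv[i:i + n] == pattern_intervals]
```

Because the melody is reduced to intervals, the same pattern is found in any transposition, which is what makes cross-repertoire comparison (Brazil, Italy, Portugal) possible.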
|
3 |
Noções de programação estruturada em Python no ensino de Física: um caminho para o ensino médio por meio da cultura lúdica / Notions of structured programming with Python in the teaching of Physics: a path to high school through ludic culture. Parizotto, Giovanna Moreno, 14 September 2017 (has links)
In this qualitative research with case-study elements, we discuss why the use of notions of structured programming in the Python language constituted a manipulation of play culture for the teaching of Physics in the first year of evening high school. Manipulation here refers to aspects of the notion of the game and of game-related characteristics, recognizing the game as a site where ludic culture emerges. The theme is tied to the researcher's teacher training, as she seeks to enrich the ludic culture of the student body with which she has the greatest didactic difficulty. Over the course of the research, the researchers found several characteristics of games during the interventions. This process is discussed in terms of the characteristics of the philosophical game proposed by Brougère (1998) and of the typical behaviours regarded by Caillois (1990) as primary impulses related to the term game. We also relate characteristics of the corruption of these primary impulses to the term "lúdico lúbrico".
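The dissertation does not reproduce its classroom code; a free-fall exercise of the kind such an intervention might use (all names and step sizes are illustrative) shows the structured-programming constructs involved, a function, a loop and an accumulating list:

```python
def fall_positions(height, dt=0.1, g=9.8):
    """Free fall from rest: (time, position) at each step until the ground,
    using a simple Euler integration loop."""
    y, v, t = height, 0.0, 0.0
    trajectory = [(t, y)]
    while y > 0:
        v += g * dt          # velocity grows under gravity
        y -= v * dt          # position drops by the distance covered
        t += dt
        trajectory.append((t, max(y, 0.0)))  # clamp at the ground
    return trajectory
```

Students can vary `dt` or `g` and plot the list, connecting the program's control flow directly to the physics of uniformly accelerated motion.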
|
5 |
Développements HPC pour une nouvelle méthode de docking inverse : applications aux protéines matricielles. / HPC developments for a new inverse docking method with applications to matrix proteins. Vasseur, Romain, 29 January 2015 (has links)
This thesis work is a methodological and software development of a so-called inverse molecular docking method. Through an in-house program, AMIDE (Automatic Inverse Docking Engine), the method distributes large numbers of molecular docking simulations across HPC architectures (computing clusters) using the AutoDock 4.2 and AutoDock Vina applications. The principle is to test small molecules against a set of potential target proteins. Optimal parameters were defined in a pilot study, and the protocol was validated on ligands and peptides binding the MMP and EBP extracellular matrix proteins.
This method improves the conformational search in docking computations on experimental structures compared with existing protocols (blind docking). The AMIDE program is shown to discriminate preferred binding sites in inverse protein-screening experiments more effectively than blind docking. These results are obtained by partitioning the search space, which, through a hybrid distribution system, also allows a set of independent, embarrassingly parallel tasks to be deployed at scale.
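AMIDE's exact partitioning scheme is not given in the abstract. One plausible reading, splitting the docking box into uniform sub-boxes so that each becomes an independent AutoDock/Vina job, can be sketched as follows (purely illustrative, not AMIDE's code):

```python
def partition_box(center, size, splits):
    """Split a docking search box into splits**3 sub-boxes, one per
    independent, embarrassingly parallel docking job."""
    cx, cy, cz = center
    sx, sy, sz = (s / splits for s in size)
    boxes = []
    for i in range(splits):
        for j in range(splits):
            for k in range(splits):
                boxes.append({
                    "center": (cx - size[0] / 2 + (i + 0.5) * sx,
                               cy - size[1] / 2 + (j + 0.5) * sy,
                               cz - size[2] / 2 + (k + 0.5) * sz),
                    "size": (sx, sy, sz),
                })
    return boxes
```

Each dictionary maps directly onto a docking engine's box parameters, so the list can be dispatched as a bag of independent cluster tasks and the best-scoring poses merged afterwards.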
|
6 |
Data-driven decisions in Sports. Garcia de Baquedano, Gabriel, January 2023 (has links)
In recent years, many sectors such as insurance, banking and retail have adopted Big Data architectures to boost their business activities. Such tools not only bring greater profit to these companies but also allow them to gain a better understanding of their customers and their needs. These techniques are being adopted rapidly, including in sports and team sports, for tasks such as injury prediction and prevention, performance improvement and fan engagement. The aim of this project is to analyze the implications of data-driven decisions, focusing on their current and future use in sports. A player-scouting and team-tailoring application is then designed and deployed to support the technical staff's decision-making process, which also entails budget optimization. The Python programming language and RapidMiner are used, implementing fuzzy-logic techniques for player scouting and knapsack-problem algorithms for budget optimization, plus an additional price-prediction algorithm. The outcome is an application that, given certain player needs (e.g., a midfielder with high pass accuracy and high ball recovery, and a goalkeeper with a large number of saves and many minutes played) and the available budget, suggests the best possible combination of players, together with an algorithm capable of predicting prices. The project also studies how this application could be deployed in a real-case situation by estimating the work team and budget required to do so.
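The budget-optimization step is named as a knapsack problem: player prices are the weights, ratings the values, the budget the capacity. A minimal 0/1 knapsack over hypothetical players (names, prices and ratings invented for illustration, not the thesis's data) could look like:

```python
def best_squad(players, budget):
    """0/1 knapsack: pick the subset of (name, price, rating) players
    with the highest total rating that fits the budget (integer prices)."""
    # dp[b] = (best total rating, chosen names) achievable with budget b
    dp = [(0, [])] * (budget + 1)
    for name, price, rating in players:
        # iterate budgets downwards so each player is used at most once
        for b in range(budget, price - 1, -1):
            candidate = dp[b - price][0] + rating
            if candidate > dp[b][0]:
                dp[b] = (candidate, dp[b - price][1] + [name])
    return dp[budget]
```

`dp[b]` records the best rating achievable with budget `b`, so the final entry yields both the rating and the chosen squad; a price-prediction model would feed the `price` field when market values are unknown.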
|
7 |
A GIS approach for improving transportation and mobility in Iqaluit, Nunavut Territory. Copithorne, Dana, 22 December 2011 (has links)
Planning for transportation within northern Canadian communities presents unique challenges, but new research tools offer opportunities for testing potentially innovative solutions that might help improve mobility within these communities. In particular, problem solving has been enriched in recent years by the spatial modeling methods offered by Geographical Information Systems (GIS). This thesis first reviews various GIS methods before applying one of them, the 'Route Utility Theory', to a newly developed set of metrics for determining the cost of alternate modes of intra-community transportation. This set of metrics is applied to a data set representing the trips or journeys made by non-car users in Iqaluit, the capital city of Nunavut Territory. GIS data on roads, walking trails, land contours, and public and residential neighbourhoods are analyzed. The results facilitate comparisons between road options and trail options for improving the movement of people within Iqaluit. Five bus routes were then custom designed and compared using the study's metrics. The study found that increasing bus and trail options within Iqaluit would provide more efficient choices for non-car users. It is argued that the study's metrics can be adapted for application in other northern communities, and possibly in other isolated and rural communities in different world situations. / Graduate
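The study's Route Utility Theory metrics are not spelled out in the abstract. As an illustration only of the kind of slope-aware, per-mode cost comparison described (the formula and penalty factor below are assumptions, not the study's):

```python
def route_cost(segments, mode_speed_kmh, slope_penalty=0.1):
    """Travel-time cost of a route made of (length_km, grade_percent)
    segments; steeper grades inflate the effective segment length.
    Illustrative metric only, not the thesis's Route Utility formula."""
    cost_hours = 0.0
    for length_km, grade in segments:
        effective = length_km * (1 + slope_penalty * abs(grade))
        cost_hours += effective / mode_speed_kmh
    return cost_hours
```

Evaluating the same segment list with walking, trail and bus speeds gives comparable costs per mode, which is the shape of comparison the thesis makes between road and trail options.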
|
8 |
On the Generalized Finite Element Method in nonlinear solid mechanics analyses / Sobre o método dos Elementos Finitos Generalizados em análises da mecânica dos sólidos não-linear. Piedade Neto, Dorival, 29 November 2013 (has links)
The Generalized Finite Element Method (GFEM) is a numerical method based on the Partition of Unity (PU) concept and inspired by both the Partition of Unity Method (PUM) and the hp-Cloud method. In the GFEM, the PU is provided by first-degree Lagrangian interpolation functions defined over a mesh of elements similar to Finite Element Method (FEM) meshes. In fact, the GFEM can be considered an extension of the FEM in which enrichment functions can be applied in specific regions of the problem domain to improve the solution. The technique has been successfully employed to solve problems presenting discontinuities and singularities, such as those that arise in Fracture Mechanics. However, most publications on the method concern linear analyses. The present thesis is a contribution to the few studies of nonlinear Solid Mechanics analyses by means of the GFEM. One of its main topics is the derivation of a segment-to-segment generalized contact element based on the mortar method. Material and kinematic nonlinear phenomena are also considered in the numerical models. An object-oriented design was developed for a GFEM nonlinear analysis framework written in the Python programming language. The results validate the formulation and demonstrate the gains, and possible drawbacks, of the GFEM nonlinear approach.
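The abstract's description of the GFEM, a first-degree Lagrangian partition of unity multiplied by enrichment functions, can be illustrated in one dimension (a minimal sketch, not the thesis's framework):

```python
def hat(x, xi, h):
    """First-degree Lagrangian (hat) PU function centred at node xi
    with support of half-width h."""
    return max(0.0, 1.0 - abs(x - xi) / h)

def gfem_interpolate(x, nodes, h, u, enrich=None, a=None):
    """GFEM approximation u_h(x) = sum_i N_i(x) * (u_i + a_i * E(x)).
    With enrich=None this reduces to standard FEM interpolation."""
    total = 0.0
    for i, xi in enumerate(nodes):
        Ni = hat(x, xi, h)
        term = u[i]
        if enrich is not None:
            term += a[i] * enrich(x)
        total += Ni * term
    return total
```

The hat functions sum to one at every point of the mesh, which is exactly the partition-of-unity property that lets an enrichment function (e.g., a singular crack-tip field) be pasted into specific regions without losing inter-element continuity.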
|
10 |
Automatic Probing System for PCB: Analysis of an automatic probing system for design verification of printed circuit boards. Aalto, Alve and Jafari, Ali, January 2015 (has links)
The purpose of this thesis is to analyze whether printed circuit boards from Ericsson can be tested using an automatic probing system, or what design changes would be required for that to be a viable solution. The main instrument used for analyzing the printed circuit board was an oscilloscope, which provided the raw data for plotting the difference between the theoretical and actual signals. Connected to the oscilloscope was a 600A-AT probe from LeCroy. The programs used to interpret the raw data extracted from the oscilloscope included Python, Matlab and Excel, and HFSS and ADS were used to simulate how an extra via in the signal path would affect the end results. The results were extracted into separate Excel sheets for an easier overview. They showed that a board's design must be almost completely rebuilt to accommodate the changes, so it is better to implement them in a new circuit board rather than an existing one. Some components must either be made smaller or placed on one side of the board, where they cannot obstruct the probe. The board will also become larger, since the rules for via placement will be more restrictive than before. The most time-demanding part was simulating the extra via in the signal path; the results showed that for a single-ended signal below two gigahertz the placement of the via makes little difference, while at higher frequencies the placement depends mostly on the type of signal. The optimal placement is generally around four millimeters from the receiving end, since placing the via closer causes the signals to interfere with each other.
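The comparison between theoretical and measured signals is described only qualitatively. A simple figure of merit for such traces (an assumption for illustration, not the thesis's metric) is the RMS deviation between equal-length sample vectors:

```python
import math

def rms_deviation(measured, ideal):
    """Root-mean-square deviation between a sampled oscilloscope trace
    and the theoretical waveform (both as equal-length sample lists)."""
    if len(measured) != len(ideal):
        raise ValueError("traces must have the same number of samples")
    return math.sqrt(sum((m - i) ** 2 for m, i in zip(measured, ideal))
                     / len(measured))
```

Computing this per via placement in the HFSS/ADS sweeps would condense each simulated configuration into a single number suitable for the kind of Excel-sheet overview the thesis describes.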
|