251 |
Impacto da utilização da somatotropina bovina (bST) sobre a produção de leite e a avaliação genética de bovinos da raça Holandesa / Impact of bovine somatotropin (bST) on the milk yield and genetic evaluation of Holstein cattle
Rodrigues, Marcelo 18 February 2009 (has links)
O objetivo do presente trabalho foi estudar a influência do uso da somatotropina bovina (bST) sobre a produção de leite e a avaliação genética de bovinos da raça Holandesa. Para análise foram utilizados dados referentes a 474 touros e observações referentes a 3341 lactações de 1271 vacas, provenientes da Agropecuária Agrindus - S.A no estado de São Paulo no período de 1999 a 2003. Análise de variância (método dos quadrados mínimos) foi realizada pelo procedimento GLM do SAS® (2003), visando identificar o efeito da classe de aplicação do bST sobre a produção de leite aos 305 dias de lactação (PL305). Os valores genéticos preditos dos touros (PBV), componentes de variância e herdabilidade para a característica PL305 foram estimados utilizando-se um modelo animal a partir de duas análises; na primeira incluiu-se o efeito do bST como fixo e na segunda o referido efeito foi ignorado. Foram calculadas correlações de Spearman entre PBV dos touros para quatro conjuntos de touros avaliados: a) todos os touros; b) os melhores 20%; c) os melhores 10% e d) os melhores 5%. As médias da PL305 para as classes de bST foram 9175,11kg - sem bST, 9530,94kg - de 11 a 20 aplicações, 10150,57kg - de 21 a 30 aplicações e 11089,89kg - de 31 a 59 aplicações. As herdabilidades foram respectivamente de 0,26±0,00 e 0,23±0,00 para as duas análises e as correlações entre os valores genéticos preditos para os conjuntos de touros a, b, c e d foram, respectivamente, 0,9484, 0,9829, 0,9752 e 0,8974. A análise de variância demonstrou que as médias de PL305 aumentaram significativamente (P<0,0001) com o aumento do número de aplicações do bST. Os coeficientes de herdabilidade, embora relativamente baixos, indicam possível ganho genético, por meio de seleção, para produção de leite. As altas correlações de Spearman entre os valores genéticos dos touros, considerando-se ou não o uso do bST no modelo, indicam que o uso desta tecnologia não interfere na classificação dos touros avaliados geneticamente.
/ The aim of this study was to investigate the influence of the use of bovine somatotropin (bST) on milk production and on the genetic evaluation of Holstein cattle. Data on 474 bulls and observations on 3341 lactations of 1271 cows from Agropecuária Agrindus S.A. in the state of São Paulo, Brazil, between 1999 and 2003 were used in the analysis. Analysis of variance (least squares method) was performed with the GLM procedure of SAS® (2003) to identify the effect of the bST application class on milk production at 305 days of lactation (PL305). The predicted breeding values (PBV) of the bulls, the variance components and the heritability of the PL305 trait were estimated using an animal model in two analyses: in the first, bST was included as a fixed effect; in the second, this effect was ignored. Spearman correlations between the bulls' PBV were calculated for four sets of evaluated bulls: a) all bulls; b) the best 20%; c) the best 10%; and d) the best 5%. The PL305 averages for the bST classes were 9175.11 kg without bST; 9530.94 kg for 11 to 20 applications; 10150.57 kg for 21 to 30 applications; and 11089.89 kg for 31 to 59 applications. The heritabilities were 0.26±0.00 and 0.23±0.00 for the two analyses, respectively, and the correlations between the PBV for bull sets a, b, c and d were, respectively, 0.9484, 0.9829, 0.9752 and 0.8974. The analysis of variance showed that the PL305 averages increased significantly (P<0.0001) with the number of bST applications. The heritability coefficients, although relatively low, indicate possible genetic gain for milk production through selection. The high Spearman correlations between the bulls' PBV, whether or not the bST effect was included in the model, indicate that the use of this technology does not interfere with the ranking of genetically evaluated bulls.
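The ranking comparison above relies on Spearman's rank correlation. A minimal sketch of that computation, with invented PBV figures that are not the study's data, might look like this:

```python
# Hedged sketch, not the study's code: Spearman's rank correlation between
# predicted breeding values (PBV) from two models, one including the bST
# effect and one ignoring it. All PBV figures are invented for illustration.

def ranks(values):
    """Rank values (1 = largest), averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # find the end of a block of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[j]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# PBV (kg of milk) for five hypothetical bulls under the two models
pbv_with_bst    = [312.0, 185.0, 440.0, 95.0, 260.0]
pbv_without_bst = [290.0, 250.0, 455.0, 80.0, 240.0]
rho = spearman(pbv_with_bst, pbv_without_bst)
print(round(rho, 4))  # prints 0.9
```

If both models ranked the bulls identically the coefficient would be exactly 1; the small reordering here yields 0.9, illustrating how values such as 0.9484 indicate near-identical rankings.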
|
252 |
Corrélations électroniques des acènes vers la limite de longue taille : étude par Monte Carlo quantique / Electronic correlations in the acenes toward the long-length limit : a Monte Carlo study
Dupuy, Nicolas 29 April 2016 (has links)
Nous étudions les acènes avec le modèle de fonction d'onde électronique fortement corrélé Jastrow antisymmetrized geminal power (JAGP) par Monte Carlo quantique (QMC). Ces méthodes permettent d'optimiser les fonctions JAGP de façon variationnelle (minimisation énergétique), et d'accéder à l'énergie de leur niveau fondamental électronique si les nœuds de leur fonction d'onde sont bien définis. En modulant la liberté variationnelle des formes JAGP nous étudions leurs propriétés électroniques en fonction de la qualité de la fonction d'onde. Nous obtenons ainsi des résultats en faveur d'un caractère fortement résonant mais étalé sur plusieurs états, incompatible avec la présence précédemment supposée de couches ouvertes, et le constat de biais induits par un niveau trop faible de liberté variationnelle. Par relaxation structurale effectuée en QMC sur des fonctions de différentes qualités nous montrons que les géométries des acènes sont très couplées à la structure électronique. Nous pouvons envisager d'étendre cette étude aux hydrocarbures polycycliques aromatiques à l'allure de nanorubans de graphène d'épaisseur croissante afin d'étudier de possibles corrélations entre sextets de Clar et l'évolution de leurs propriétés électroniques et spintroniques. / We study the family of acenes by means of quantum Monte Carlo (QMC) methods based on a Jastrow correlated antisymmetrized geminal power (JAGP) wave function. These methods allow the JAGP wave function to be optimized variationally (energy minimisation) and give access to the electronic ground-state energy when the wave-function nodes are well defined. By tuning the variational freedom of the JAGP wave functions we study their electronic properties as a function of wave-function quality. We obtain results in favour of a highly resonating character, spread over many states, incompatible with the previously assumed open-shell character.
The study also demonstrates that too little variational freedom induces large biases in the electronic description. By QMC structural relaxation with wave functions of various qualities we demonstrate that the acene geometries are strongly coupled to their electronic structure. We envisage extending this study to polycyclic aromatic hydrocarbons resembling graphene nanoribbons of increasing width, in order to investigate possible correlations between Clar sextets and the evolution of their electronic and spintronic properties.
|
253 |
Code analysis : Uncovering hidden problems in a codebase by commits to a version control system / Kodanalys : Avslöja gömda problem i en kodbas med commits till ett versionshanteringssystem
Reijo, Ken, Kåhre, Martin January 2019 (has links)
Syfte – Startpunkten för den här studien var att identifiera effektiva och ineffektiva delar av en kodbas som kan leda till förbättringar av koden. Detta utfördes på förfrågan av ett företag. För att stödja förfrågan utnyttjades en bok av författaren Adam Tornhill. Han diskuterar flera metoder (också kallade analysmetoder) som används för att lokalisera och analysera platser i kod där effektivitet och ineffektivitet föreligger. De använder alla diverse variabler för det här syftet. En webbapplikation utvecklades för att upptäcka den här varianten av kod med hjälp av dessa variabler. Det spekulerades att det kan saknas samband mellan variablerna som kan upptäckas. Detta skulle ge mer insikt i hur verksam och overksam kod kan urskiljas och evalueras, utöver den insikt Adam Tornhills metoder understöder. Det vetenskapliga syftet och bidraget var därför att särskilja potentiella korrelationer mellan analysmetoderna. En frågeställning härleddes från detta syfte: ● Finns det korrelation hos variabler bland befintliga analysmetoder och i sådant fall, vilka är det? Metod – För att svara på frågeställningen var en noggrannare granskning av variablerna nödvändig för att utvärdera vilka som hade potentiella relationer. Efter det hämtades kvantitativa data av de valda variablerna från 7 open source-projekt. Data erhölls från git commit-historik från 2 år tillbaka. Informationen presenteras i form av grafer som examinerades för mönster och kontext bland de analysmetoder som var i fokus. Statistiska formler (som Pearsons korrelationskoefficient) utnyttjades i syfte att beräkna exakt korrelation för variablerna. Signifikansnivån sattes till 0,001 och ett p-värde kalkylerades. För de projekt med p-värde mindre än signifikansnivån beräknades även median och medelvärde av deras korrelationskoefficienter. Resultat – I slutet påträffades två variabler som undersöktes genom grafer för samband.
Utredningen visade ett tydligt mönster som indikerar att när fler personer arbetar på en fil kommer också antalet logiska kopplingar att öka för korresponderande fil. För de olika korrelationsvärdena visade det sig att 6 av de 7 projekten hade ett p-värde mindre än den satta signifikansnivån 0,001, vilket innebär att 6 koefficienter är mycket statistiskt signifikanta. Det var bara för 5 av de 6 projekten med godkänd signifikans som positiv korrelation uppmättes. Medelvärdet för de 6 projekten med p-värde mindre än signifikansnivån var 0,41 och medianvärdet 0,63, vilket indikerar en positiv korrelation mellan antal författare och logiska kopplingar. Implikationer – I projekt där många personer jobbar på en fil borde försiktighetsåtgärder vidtas med avseende på logiska kopplingar. Vissa förhindrande medel kanske kan etableras för att vägleda andra att minska, eller i alla fall inte onödigt ackumulera, logiska kopplingar när åtskilliga personer arbetar i en fil. Begränsningar – Bara två analysmetoder och deras två variabler undersöktes för korrelation och det kan finnas fler variabler som kan ha korrelation. De 7 projekt som det utvanns data från var alla open source och därför kanske inte resultatet stämmer för closed source-projekt. / Purpose – The starting point of this study was to locate efficient and inefficient code in a codebase in order to improve it, at the request of a company. To support this request, a book by Adam Tornhill was used; he describes several methods (also called analysis methods) for locating and analysing places in code where efficiency and inefficiency occur, each using different variables for this purpose. A web application was developed to detect such code using these variables. It was speculated that relationships between the variables might exist that had not yet been uncovered, which could improve the analysis methods and in turn make efficient and inefficient code easier to identify. The main scientific purpose and contribution was therefore to discover associations between the specific variables presented by Adam Tornhill.
A question was derived from this purpose: ● Are there correlations between variables among the existing analysis methods and, if so, which are they? Method – To answer this question, a closer look at the variables was needed to see which ones had potential connections. Empirical data on the chosen variables was then gathered as quantitative data from 7 open source projects, covering two years of git commit history. The data was presented as graphs, which were reviewed for possible patterns, and statistical formulas (Pearson's correlation coefficient) were used to calculate the exact correlation between the variables. A significance level was set at 0.001 and a p-value was calculated. The median and mean of the correlation coefficients of the projects with a p-value below the significance level were also calculated. Findings – Two variables were ultimately inspected, the number of authors and the number of logical couplings for the same file, and combined into a new analysis method with a graph. The graph shows a clear pattern: as more people work on a module, the number of logical couplings increases. For 6 out of the 7 projects analysed, the p-value was below the significance level set at the outset, meaning that 6 coefficients were highly statistically significant. A positive coefficient was calculated for only 5 of these 6. For the 6 projects with a p-value below the significance level, the mean correlation coefficient was 0.41 and the median 0.63, both of which indicate a positive correlation between the number of authors and the number of logical couplings. Implications – Projects with several people working on a module should watch out for logical couplings on that same module.
Perhaps preventative measures can be taken to ensure that people watch out for these logical couplings as more people start working on a module. Limitations – Only two analysis methods and their variables were inspected for correlation, and there could be further correlations that were missed. Furthermore, the 7 projects used as data were open source, so the results of this study may not hold for closed source projects.
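The correlation step described in the Method section (Pearson's correlation coefficient followed by a significance test) can be sketched as follows; the per-file author and coupling counts are hypothetical, not data from the study's 7 projects:

```python
# Hedged sketch, not the thesis code: Pearson's r between the number of
# authors and the number of logical couplings per file, plus the t statistic
# commonly used to test significance. The counts below are invented.
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# per-file (authors, logical couplings) pairs -- invented for illustration
authors   = [1, 2, 2, 3, 4, 5, 6, 8]
couplings = [0, 1, 3, 2, 4, 6, 5, 9]

r = pearson(authors, couplings)
n = len(authors)
# t statistic with n - 2 degrees of freedom; the p-value follows from the
# t distribution (the thesis used a 0.001 significance level)
t = r * math.sqrt((n - 2) / (1 - r * r))
print(round(r, 3), round(t, 2))
```

The t statistic with n - 2 degrees of freedom would then be compared against a t table at the chosen significance level to decide whether the coefficient is statistically significant.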
|
254 |
A STUDY ON THE DCC-GARCH MODEL’S FORECASTING ABILITY WITH VALUE-AT-RISK APPLICATIONS ON THE SCANDINAVIAN FOREIGN EXCHANGE MARKET
Andersson-Säll, Tim, Lindskog, Johan January 2019 (has links)
This thesis has treated the subject of the DCC-GARCH model’s forecasting ability and Value-at-Risk applications on the Scandinavian foreign exchange market. The estimated models were based on daily opening foreign exchange spot rates in the period 2004-2013, which captured the information in the financial crisis of 2008 and the Eurozone crisis of the early 2010s. The forecasts were performed on a one-day rolling window in 2014. The results show that the DCC-GARCH model accurately predicted the fluctuation in the conditional correlation, although not with the correct magnitude. Furthermore, the DCC-GARCH model shows good Value-at-Risk forecasting performance for different portfolios containing the Scandinavian currencies.
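Fitting a DCC-GARCH model is well beyond a few lines, but the Value-at-Risk side of the forecast evaluation can be illustrated with a plain historical-simulation estimate on a rolling window; the daily returns below are invented and the technique shown is a simpler substitute for the model-based VaR the thesis uses:

```python
# Hedged illustration only: the thesis uses a DCC-GARCH model, which is far
# more involved; this shows just the historical-simulation flavour of a
# one-day Value-at-Risk estimate on a rolling window, with made-up returns.

def historical_var(returns, alpha=0.05):
    """One-day VaR at level alpha: the loss exceeded with probability alpha."""
    ordered = sorted(returns)                 # worst returns first
    k = max(0, int(alpha * len(ordered)) - 1)
    return -ordered[k]                        # report VaR as a positive loss

# 20 days of hypothetical portfolio returns (the rolling window)
window = [0.004, -0.012, 0.007, -0.003, 0.009, -0.021, 0.002,
          -0.006, 0.011, -0.009, 0.005, -0.015, 0.001, 0.008,
          -0.002, 0.013, -0.007, 0.003, -0.011, 0.006]
var_95 = historical_var(window, alpha=0.05)
print(var_95)  # prints 0.021
```

In a rolling backtest, the window would be advanced one day at a time through 2014 and the realized return compared against each day's VaR forecast.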
|
255 |
Medidas de centralidade em redes complexas: correlações, efetividade e caracterização de sistemas / Centrality measures in complex networks: correlations, effectiveness and characterization of systems
Ronqui, José Ricardo Furlan 19 February 2014 (has links)
Centralidades são medidas desenvolvidas para determinar a importância dos nós e ligações, utilizando as características estruturais das redes para esta finalidade. As medidas de centralidade são, portanto, essenciais no estudo de redes complexas pois os sistemas representados por elas geralmente são formados por muitos elementos, e com isso, torna-se inviável estudar individualmente cada um deles; dessa forma é necessário identificar os nós e ligações que são mais relevantes em cada situação. Todavia, com o surgimento de ideias diferentes de como esses elementos podem ser importantes, diversas medidas foram propostas com o intuito de evidenciar elementos que passam despercebidos pelas demais. Neste trabalho utilizamos a correlação de Pearson para avaliar o quão semelhantes são as classificações fornecidas pelas centralidades para redes representando sistemas reais e modelos teóricos. Para avaliar a efetividade das medidas e como elas afetam cada sistema, atacamos as redes usando as centralidades como indicadores para a ordem de remoção dos nós e ligações. Procurando caracterizar as redes usando suas diferenças estruturais, realizamos uma análise de componentes principais empregando as correlações entre os pares de centralidade como características de cada sistema. Nossos resultados mostraram que na maioria dos casos medidas distintas estão correlacionadas, o que indica que em geral os mesmos elementos são evidenciados pelas diferentes centralidades; também observamos que as correlações são mais fortes nos modelos do que nos sistemas reais. Os ataques mostraram que medidas fortemente correlacionadas podem influenciar as redes de maneiras distintas, evidenciando a importância do conjunto de elementos selecionados por cada medida. 
Nosso último resultado demonstra que as correlações entre os pares de centralidades podem ser utilizadas tanto para a diferenciação e caracterização de redes quanto na avaliação de modelos que representem melhor a estrutura de um sistema específico. / Centrality measures were developed to evaluate the importance of nodes and links based on the structure of networks. Centralities are essential in the study of networks because these systems are usually large, which makes manual analysis of all nodes and links impossible; recognizing such elements is therefore a vital task. Since nodes and links can be considered essential for different reasons, a large number of measures have been proposed to identify important elements that are not highlighted by the others. In our study, we use Pearson's correlation coefficient to measure the similarity between the rankings of nodes and links provided by different centralities for real and model-based networks. We also perform attacks on the networks, using these rankings to determine the order of removal of nodes and links, in order to evaluate and compare the efficiency of the centralities and how the systems react to attacks guided by each of them. Finally, we use the correlation coefficients between pairs of centralities as properties of the networks and perform a principal component analysis on them, to evaluate whether differences among network structures can be detected from the correlations. Our results show that centrality measures are frequently correlated, which means that the same elements can be highlighted by different centralities. We also notice that the correlation coefficients are larger in models than in real-world networks. The attack experiments show that even when two measures are highly correlated they can affect networks in distinct ways, meaning that the set of nodes and links selected by each measure is relevant for the study of network systems.
Our last result shows that correlations among centrality measures can be used both to characterize and differentiate networks and to evaluate how well models represent the structure of a specific system.
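As a hedged illustration of the core idea, the sketch below computes two simple centralities (degree and closeness) on a toy graph and correlates them with Pearson's coefficient; the toy graph is invented, and the thesis works with real and model networks and a broader set of measures:

```python
# Hedged sketch: two centrality measures on a small undirected graph and the
# Pearson correlation between them. The graph is a made-up example.
from collections import deque

# adjacency list of a small connected graph (a hub node 0 plus a tail)
graph = {
    0: [1, 2, 3, 4],
    1: [0, 2],
    2: [0, 1],
    3: [0],
    4: [0, 5],
    5: [4],
}

def degree_centrality(g):
    return {v: len(nbrs) for v, nbrs in g.items()}

def closeness_centrality(g):
    """Closeness = (n - 1) / sum of shortest-path distances (via BFS)."""
    out = {}
    n = len(g)
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in g[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        out[src] = (n - 1) / sum(dist[v] for v in g if v != src)
    return out

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

nodes = sorted(graph)
deg = degree_centrality(graph)
clo = closeness_centrality(graph)
r = pearson([deg[v] for v in nodes], [clo[v] for v in nodes])
print(r)  # strongly positive: both centralities favour the hub node
```

A coefficient this close to 1 is what the abstract means by frequently correlated measures: both centralities single out the same hub, even though, as the attack experiments show, highly correlated measures can still affect networks differently.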
|
256 |
A Unitary Perturbation Theory Approach to Real-Time Evolution in the Hubbard Model
Kreye, Manuel 23 October 2019 (has links)
No description available.
|
257 |
Relationships Among Teachers' Attitudes, Behaviors Toward English Language Learners, Experience, and Training
Mitchell, Sandra 01 January 2016 (has links)
Public school teachers must meet the unique needs of English language learners (ELLs) in the general education classroom. There is a need to understand teacher attitudes toward ELLs because attitudes can explain and influence teacher behavior and professional practice. The purpose of this quantitative study was to examine the relationships of attitudes and behavior with years of experience and professional development among teachers working with ELLs. Sociocultural, situational learning, and second language acquisition theories provided the theoretical foundation for the study. Data were collected from 286 teachers using the Teacher Attitudes Toward English-as-a-Second-Language Survey. Analyses included descriptive statistics, correlational analysis, independent sample t tests, and the Mann-Whitney U test. Results indicated a significant, direct correlation between teachers' years of experience and their attitudes regarding coursework modifications. The independent sample t tests indicated significant differences in a subscale of the teaching behavior variable between participants who had and had not received adequate training. In addition, significant differences in teachers' attitudes existed between participants who had and had not received professional development. The study can effect social change at the local site by fostering an increased understanding of how experience and professional development influence teachers' attitudes toward inclusion and behaviors toward ELLs, thereby highlighting the importance of professional development and experience for meeting the needs of ELL students.
|
258 |
Phenomenology of Charged Higgs Bosons and B-meson Decays
Eriksson, David January 2009 (has links)
For more than 30 years the Standard Model has been the theoretical foundation for particle physics. The theory has been verified successfully by experimental tests. Its biggest shortcoming is the non-discovery of the Higgs boson, responsible for giving the other particles masses. Despite its success there are hints that the Standard Model is not the complete theory, and many extensions of it, such as supersymmetry, have been proposed. Extended theories often predict the existence of a charged Higgs boson, and its detection would be a clear sign of physics beyond the Standard Model. The main focus in this thesis is on various phenomenological aspects of the charged Higgs boson. For favorable masses and couplings, direct detection is shown to be possible at the Large Hadron Collider in production with an associated W boson. It is also shown how a light charged Higgs can have measurable effects on spin correlations in decays of pair-produced top quarks. The charged Higgs boson can also be seen indirectly, for example in B-meson decays, which can be used to put constraints on its mass and fermion couplings. Exclusion limits in two supersymmetric models are given, together with a comparison with the discovery potentials for the LHC experiments. A tool for calculating properties, such as masses and decays, of both charged and neutral Higgs bosons in the Two-Higgs-Doublet Model is also presented. B-meson decays can also be used to test aspects of the strong interaction. Part of this thesis deals with improving and applying phenomenological models to B-meson decays. Although these models are not derived from first principles, their success shows that they capture important features of non-perturbative strong interactions.
|
259 |
Electrostatics of the Binding and Bending of Lipid Bilayers: Charge-Correlation Forces and Preferred Curvatures
Li, Yang January 2004 (has links)
Lipid bilayers are key components of biomembranes; they are self-assembled two-dimensional structures, primarily serving as barriers to the leakage of the cell's contents. Lipid bilayers are typically charged in aqueous solution and may interact electrostatically with each other and with their environment. In this work, we investigate the electrostatics of charged lipid bilayers, with the main focus on the binding and bending of the bilayers.
We first present a theoretical approach to charge-correlation attractions between like-charged lipid bilayers with neutralizing counterions assumed to be localized to the bilayer surface. In particular, we study the effect of nonzero ionic sizes on the attraction by treating the bilayer charges (both backbone charges and localized counterions) as forming a two-dimensional ionic fluid of hard spheres of the same diameter <i>D</i>. Using a two-dimensional Debye-Hückel approach to this system, we examine how ion sizes influence the attraction. We find that the attraction gets stronger as surface charge densities or counterion valency increase, consistent with long-standing observations. Our results also indicate non-trivial dependence of the attraction on separations <i>h</i>: The attraction is enhanced by ion sizes for <i>h</i> ranges of physical interest, while it crosses over to the known <i>D</i>-independent universal behavior as <i>h</i> → ∞; it remains finite as <i>h</i> → 0, as expected for a system of finite-sized ions.
We also study the preferred curvature of an asymmetrically charged bilayer, in which the inner leaflet is negatively charged, while the outer one is neutral. In particular, we calculate the relaxed area difference Δ <i>A</i><sub>0</sub> and the spontaneous curvature <i>C</i><sub>0</sub> of the bilayer. We find Δ <i>A</i><sub>0</sub> and <i>C</i><sub>0</sub> are determined by the balance of a few distinct contributions: net charge repulsions, charge correlations, and the entropy associated with counterion release from the bilayer. The entropic effect is dominant for weakly charged surfaces in the presence of monovalent counterions only and tends to expand the inner leaflet, leading to negative Δ <i>A</i><sub>0</sub> and <i>C</i><sub>0</sub>. In the presence of even a small concentration of divalent counterions, however, charge correlations counterbalance the entropic effect and shrink the inner leaflet, leading to positive Δ <i>A</i><sub>0</sub> and <i>C</i><sub>0</sub>. We outline biological implications of our results.
|
260 |
Optimization of a petroleum producing assets portfolio: development of an advanced computer model
Aibassov, Gizatulla 15 May 2009 (has links)
Portfolios of contemporary integrated petroleum companies consist of a few dozen Exploration and Production (E&P) projects that are usually spread all over the world. Therefore, it is important not only to manage individual projects by themselves, but also to take into account the interactions between projects in order to manage whole portfolios. This study is a step-by-step presentation of a method for optimizing portfolios of risky petroleum E&P projects, based on Markowitz’s Portfolio Theory. The method uses the covariance matrix between projects’ expected returns to optimize their portfolio.

The developed computer model consists of four major modules. The first module generates petroleum price forecasts; in our implementation we used a price forecasting method based on Sequential Gaussian Simulation. The second module, Monte Carlo, simulates the distribution of reserves and a set of expected production profiles. The third module calculates expected after-tax net cash flows and estimates performance indicators for each realization, thus yielding a distribution of return for each project. The fourth module estimates the covariance between the return distributions of individual projects and compiles them into portfolios. Using the results of the fourth module, analysts can make their portfolio selection decisions.

Thus, an advanced computer model for optimization of a portfolio of petroleum assets has been developed. The model is implemented in a MATLAB® computational environment and allows optimization of the portfolio using three different return measures (NPV, GRR, PI). The model has been successfully applied to a set of synthesized projects, yielding reasonable solutions in all three return planes. Analysis of the obtained solutions has shown that the computer model is robust and flexible in terms of input data and output results. Its modular architecture allows further inclusion of complementary “blocks” that may solve optimization problems utilizing measures of risk and return other than those considered, as well as different input data formats.
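The Markowitz-style scoring performed by the fourth module can be sketched as follows; the expected returns and covariance matrix below are invented placeholders for what the model would obtain from its Monte Carlo and cash-flow modules:

```python
# Hedged sketch, not the thesis model: score candidate portfolios by expected
# return and variance (Markowitz-style), given per-project expected returns
# and a covariance matrix. All numbers are invented for illustration.

mu = [0.12, 0.08, 0.15]          # expected return of three hypothetical E&P projects
cov = [                          # covariance of project returns
    [0.040, 0.006, 0.010],
    [0.006, 0.020, 0.004],
    [0.010, 0.004, 0.060],
]

def portfolio_stats(w):
    """Expected return w'mu and variance w'Cov w of a weight vector w."""
    ret = sum(wi * mi for wi, mi in zip(w, mu))
    var = sum(w[i] * cov[i][j] * w[j]
              for i in range(len(w)) for j in range(len(w)))
    return ret, var

candidates = [
    [1.0, 0.0, 0.0],             # all-in single projects
    [0.0, 1.0, 0.0],
    [1 / 3, 1 / 3, 1 / 3],       # equal split
    [0.2, 0.6, 0.2],             # tilted toward the low-variance project
]
for w in candidates:
    ret, var = portfolio_stats(w)
    print(w, round(ret, 4), round(var, 5))
```

A mean-variance optimizer would then search the weight simplex for the preferred trade-off in the chosen return plane; note how the mixed portfolio's variance (0.0144) is below that of any individual project, which is the diversification effect the covariance matrix captures.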
|