71.
Improvement in techniques for understanding the large scale structure of the Universe. Gagrani, Praful (January 1900)
Master of Science / Department of Physics / Lado Samushia / Part 1: The redshift-space bispectrum (three-point statistics) of galaxies can be used to measure key cosmological parameters. In a homogeneous Universe, the bispectrum is a function of five variables and, unlike its two-point counterpart -- the power spectrum, which is a function of only two variables -- is difficult to analyse unless the information is somehow reduced. The most commonly considered reduction schemes rely on computing angular integrals over possible orientations of the bispectrum triangle, thus reducing it to sets of functions of only three variables describing the triangle shape. We use the Fisher information formalism to study the information loss associated with this angular integration. We find that most of the information is in the azimuthal averages of the first three even multipoles. This suggests that the bispectrum of every triangle configuration can be reduced to just three numbers (instead of a 2D function of orientation) without significant loss of cosmologically relevant information.
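A minimal numerical sketch of the kind of multipole compression discussed above; the toy orientation-dependent signal, noise level, and grid are illustrative assumptions, not the thesis's bispectrum model:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

# Toy orientation-dependent statistic S(mu) = A*(1 + b*mu^2 + c*mu^4), measured
# in N orientation bins with independent Gaussian noise of standard deviation sigma.
A, b, c, sigma = 1.0, 0.5, 0.2, 0.1
N = 201
mu = np.linspace(-1.0, 1.0, N)
dmu = mu[1] - mu[0]

dS_dA = 1.0 + b * mu**2 + c * mu**4        # derivative of the signal w.r.t. A

# (a) Fisher information on A keeping the full orientation dependence.
fisher_full = np.sum(dS_dA**2) / sigma**2

# (b) Compress to the first three even Legendre multipoles (L = 0, 2, 4) via a
# Riemann-sum estimator y = B x, then propagate the noise covariance through B.
Ls = [0, 2, 4]
B = np.array([(2 * L + 1) / 2.0 * dmu * Legendre.basis(L)(mu) for L in Ls])
d_y = B @ dS_dA                             # derivative of the multipoles w.r.t. A
cov_y = sigma**2 * (B @ B.T)                # covariance of the multipole estimates
fisher_multipoles = d_y @ np.linalg.solve(cov_y, d_y)

print(f"Fisher(A) from all {N} orientation bins : {fisher_full:.1f}")
print(f"Fisher(A) from 3 even multipoles        : {fisher_multipoles:.1f}")
# For this toy signal the two agree, illustrating that the low even multipoles
# can retain essentially all of the orientation information.
```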
Part 2: One way of enhancing the cosmological information extracted from the clustering of galaxies is to weight the galaxy field. The most widely used weighting schemes assign weights to galaxies based on the average local density in their region and their bias with respect to the dark matter field, and are designed to minimize the fractional variance of the galaxy power spectrum. We demonstrate that the currently used bias-dependent weighting scheme can be further optimized for specific cosmological parameters.
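For illustration, a sketch of the standard density-dependent (FKP-style) weights that minimize the fractional variance of the power spectrum; the fiducial power and number densities are assumed values, and the parameter-specific optimization described in the thesis is not reproduced here:

```python
import numpy as np

# FKP-style weights w(n) = 1 / (1 + n(z) * P0), which minimize the fractional
# variance of the measured power spectrum for a single tracer.  P0 is a fiducial
# power at the scale of interest; all numbers below are illustrative only.
P0 = 1.0e4                                  # fiducial power, (Mpc/h)^3
nbar = np.array([5e-4, 2e-4, 1e-4, 5e-5])   # mean galaxy density per redshift bin, (Mpc/h)^-3

w_fkp = 1.0 / (1.0 + nbar * P0)
print("FKP weights per redshift bin:", np.round(w_fkp, 3))
```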
Part 3: The choice of box size for a cosmological simulation involves a crucial trade-off between accuracy and computational cost. We use Lagrangian perturbation theory to study the effects of box size on the predicted power spectrum and the Baryon Acoustic Oscillation ruler. We find that although the optimal size depends on the final redshift of the evolution, in general the two-point statistics on the relevant scales are fairly accurate for simulation boxes larger than 1000 Mpc on a side.
72.
Structure formation within the cosmic web. Eardley, Elizabeth (January 2016)
In this era of high-precision cosmology we are able to measure and predict properties of the large-scale structure of our Universe to a fine degree. However, we still lack a clear and tested understanding of the effects of large-scale environments on galaxies and their host halos. This thesis focuses on improving our understanding of this issue by investigating the dependence of galaxies and halos on their location within the cosmic web. An algorithm based on the tidal tensor prescription is developed and applied to the MDR1 (1 h^-1 Gpc)^3 dark matter simulation to classify the geometric environment of every location in the simulated volume as one of the four components of the cosmic web: voids, sheets, filaments and knots. Conditional halo mass functions are extracted to investigate the influence of tidal forces on the abundances and mass distribution of dark matter halos. A theoretical framework based on Gaussian statistics is presented and used to derive predictions for halo abundances in different geometric environments. The Gaussian theory predicts no coupling to the tidal forces and, hence, that the halo mass function is independent of geometric environment for a given local mass density. It is shown that the halo mass functions extracted from the simulation are fully consistent with this picture. It is then shown how this method of classifying geometric environments can be extended to observational datasets. The Galaxy And Mass Assembly (GAMA) spectroscopic redshift survey, with its wide field and high completeness, is excellently suited to this study. The geometric environments of the three equatorial GAMA fields are classified, following a thorough analysis of the additional uncertainties introduced when moving to observational datasets. Additionally, the geometric environments of the GAMA galaxies and groups are classified, allowing the influence of the cosmic web on large-scale structure to be investigated. Both the galaxy luminosity function and the group mass function within the observed cosmic web are studied, and no evidence of a direct impact of the web is seen. It is found that all modulations can be fully attributed to the indirect dependence of these properties on the local matter overdensity. Whilst these results indicate that there is no strong dependence of the scalar properties of large-scale structure on geometric environment, the final investigation of this thesis presents an attempt to look in more detail at the environmental dependence of stellar properties by investigating star-formation histories within the cosmic web.
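A sketch of the eigenvalue-counting step of a tidal-tensor classifier of the kind described above; the random density field, smoothing scale, and eigenvalue threshold are toy assumptions, not the values used in the thesis:

```python
import numpy as np

# Classify each cell of a periodic overdensity field into void/sheet/filament/knot
# by counting eigenvalues of the tidal tensor T_ij = d2(phi)/dxi dxj above a
# threshold lam_th, where phi solves the (rescaled) Poisson equation grad^2 phi = delta.
rng = np.random.default_rng(42)
N = 32
delta = rng.normal(size=(N, N, N))          # toy Gaussian random field

k = 2.0 * np.pi * np.fft.fftfreq(N)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k2 = kx**2 + ky**2 + kz**2
k2[0, 0, 0] = 1.0                           # avoid division by zero; zero mode set below

delta_k = np.fft.fftn(delta) * np.exp(-0.5 * k2 * 2.0**2)   # Gaussian smoothing, R = 2 cells
phi_k = -delta_k / k2                       # Poisson equation in Fourier space
phi_k[0, 0, 0] = 0.0

# Second derivatives in Fourier space: d2 phi / dxi dxj  ->  -ki*kj*phi_k
kvec = (kx, ky, kz)
T = np.empty((N, N, N, 3, 3))
for i in range(3):
    for j in range(3):
        T[..., i, j] = np.fft.ifftn(-kvec[i] * kvec[j] * phi_k).real

lam = np.linalg.eigvalsh(T)                 # eigenvalues per cell, ascending
lam_th = 0.2                                # assumed threshold
n_above = (lam > lam_th).sum(axis=-1)       # 0..3 eigenvalues above threshold

labels = ["void", "sheet", "filament", "knot"]
frac = np.bincount(n_above.ravel(), minlength=4) / n_above.size
for name, f in zip(labels, frac):
    print(f"{name:8s}: {f:.2%} of cells")
```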
73.
Lacome: a cross-platform multi-user collaboration system for a shared large display. Liu, Zhangbo (05 1900)
Lacome is a multi-user cross-platform system that supports collaboration in a shared large-screen display environment. Lacome allows users to share their desktops or application windows using any standard VNC server. It supports multi-user concurrent interaction on the public shared display as well as input redirection, so users can control each other's applications. Lacome supports two separate types of interaction: window management tasks on the shared display (move, resize, iconify, de-iconify) through a Lacome client, and application interactions through the VNC servers. The system architecture provides for Publishers that share information and Navigators that access information. A Lacome client can act as either or both, and can initiate additional Publishers on other VNC servers that may not themselves be Lacome clients. Explicit access control policies on both the server side and the client side provide a flexible framework for sharing. The architecture builds on standard cross-platform components such as VNC and the JRE. Interaction techniques used in the window manager ensure simple and transparent multi-user interactions for managing the shared display space. We illustrate the design and implementation of Lacome and provide insights from initial user experience with the system. / Science, Faculty of / Computer Science, Department of / Graduate
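A toy sketch of the Publisher/Navigator split and per-window access-control policy described above; the class and field names are illustrative assumptions, not Lacome's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class SharePolicy:
    allow_view: bool = True        # Navigators may see the window contents
    allow_input: bool = False      # Navigators may redirect keyboard/mouse input
    allow_manage: bool = True      # Navigators may move/resize/iconify the window

@dataclass
class SharedWindow:
    owner: str                     # Publisher that exported this VNC desktop/window
    title: str
    policy: SharePolicy = field(default_factory=SharePolicy)

def can_perform(window: SharedWindow, user: str, action: str) -> bool:
    """Check whether a user may perform an action on a shared window."""
    if user == window.owner:
        return True                # owners keep full control of what they publish
    return {"view": window.policy.allow_view,
            "input": window.policy.allow_input,
            "manage": window.policy.allow_manage}.get(action, False)

shared = SharedWindow(owner="alice", title="slides.pdf",
                      policy=SharePolicy(allow_input=False))
print(can_perform(shared, "bob", "manage"))   # True: window management is allowed
print(can_perform(shared, "bob", "input"))    # False: input redirection is blocked
```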
74.
Development and implementation of robust large deformation and contact mechanics capabilities in process modelling of composites. Osooly, Amir (05 1900)
Autoclave processing of large-scale, one-piece structural parts made of carbon fiber-reinforced polymer composite materials is key to decreasing manufacturing costs while at the same time increasing quality. Nonetheless, even in manufacturing simple flat parts, residual strains and stresses are unavoidable. For structural design purposes and to aid in assembly procedures, it is desirable to have proven numerical tools that can predict these residual deformations and stresses in advance, thus avoiding costly experimental trial-and-error methods.
A 2-D finite element-based code, COMPRO, has previously been developed in-house for predicting autoclave process-induced deformations and residual stresses in composite parts undergoing an entire cure cycle. To simulate the tool-part interaction, an important source of residual deformations/stresses, COMPRO used a non-zero thickness elastic shear layer as its only interface option. Moreover, the code did not account for the large deformations and strains and the resulting nonlinear effects that can arise during the early stages of the cure cycle when the material is rather compliant.
In the present work, a contact surface employing a penalty method formulation is introduced at the tool-part interface. Its material-dependent parameters are functions of temperature, degree of cure, pressure and so forth. This makes it possible to model stick-slip conditions as well as separation between the part and the tool. The large displacements/rotations and large shear strains that develop in the early stages of the cure cycle, when the resin has a very low elastic modulus, provided the impetus to include a large strain/deformation option in COMPRO. A new “co-rotational stress formulation” was developed and found to provide a robust method for the numerical treatment of the very large deformation/strain problems involving the anisotropic materials of interest here.
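A minimal one-dimensional sketch of a penalty-method contact rule with Coulomb stick-slip of the kind described above; the constant stiffnesses and friction coefficient are illustrative assumptions and do not reproduce COMPRO's cure-dependent interface parameters:

```python
def penalty_contact_force(gap, slip, k_n, k_t, mu):
    """One-point penalty contact with Coulomb stick-slip.

    gap  : normal gap between part and tool (negative = penetration)
    slip : relative tangential displacement accumulated while in contact
    k_n  : normal penalty stiffness (constant here; in a cure simulation it
           would depend on temperature, degree of cure, pressure, ...)
    k_t  : tangential (shear) penalty stiffness
    mu   : Coulomb friction coefficient
    Returns (normal force, tangential force).
    """
    if gap >= 0.0:
        return 0.0, 0.0                 # separated: no contact forces
    f_n = -k_n * gap                    # normal force pushes the surfaces apart
    f_t_trial = k_t * slip              # elastic "stick" predictor
    f_t_limit = mu * f_n                # Coulomb slip limit
    if abs(f_t_trial) <= f_t_limit:
        return f_n, f_t_trial           # stick
    return f_n, f_t_limit * (1 if f_t_trial > 0 else -1)   # slip

# Illustrative values only.
print(penalty_contact_force(gap=-1e-5, slip=2e-4, k_n=1e9, k_t=1e7, mu=0.3))
```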
Several verification and validation examples are used to calibrate the contact interface parameters and to demonstrate the correctness of the implementation and the accuracy of the proposed method. A number of comparisons are made with exact solutions, other methods, experiments, and the same models implemented in commercial codes. Finally, several interesting cases are examined to explore COMPRO's predictions with the added options. / Applied Science, Faculty of / Civil Engineering, Department of / Graduate
75.
The Effect of Intact Protein from Foods and Phenylalanine-Free Medical Foods on Large Neutral Amino Acids in Patients with Phenylketonuria. Berry, Ann M; Nucci, Anita M; Douglas, Teresa D; Henes, Sarah T (08 June 2017)
Objective: The primary aim of this retrospective cohort study was to determine the association between the source of dietary protein intake and the sum of plasma concentrations of large neutral amino acids (LNAA) in patients with phenylketonuria (PKU). A secondary aim of the study was to examine the effect of dietary compliance on plasma concentration of LNAA. Methods: The analysis included combined participant data from two previous studies conducted at the Emory University School of Medicine. Subjects were males (n=34) and females (n=43) with PKU, ages 4-50 years. A Student t-test was used to compare total combined plasma LNAA (excluding tryptophan and phenylalanine) by dietary compliance status (alpha=0.05). Correlation statistics were used to determine the association between the ratio of reported intact food protein to medical food protein and plasma levels of LNAA. Multiple regression analysis was used to examine the contribution of the intact protein to medical food protein ratio and other variables to plasma LNAA. Results: The median ratio of intact protein to medical food protein reported was 0.354 (interquartile range [IQR]: 0.188, 0.914). Median percent of PHE intake over the PHE intake recommendation was 31.64 (IQR: 7.44, 104.98). Plasma concentration of LNAA did not differ significantly between those with plasma PHE levels within the therapeutic range (compliant; 611.7 μmol/L [n=19]) and those with levels above the therapeutic range (non-compliant; 595.3 μmol/L [n=47]); p=0.613. There was an inverse marginal correlation between the ratio of intact protein to medical food protein and plasma concentration of LNAA for those who were compliant (r = -0.436), although the association was not statistically significant (p=0.08). No correlation was found for patients who were non-compliant. Regression analysis revealed that plasma concentration of LNAA was not significantly affected by the intact protein to medical food protein ratio, age, or gender. Conclusions: Although not statistically significant, a negative trend was observed between plasma LNAA concentration and the intact protein to medical food protein ratio in patients compliant with the PHE prescription. This suggests that a lower ratio of intact dietary protein to protein coming from medical food, as reported in patient diet records, may promote increased plasma LNAA levels in the effective treatment of PKU. The majority of the sample (74%) were non-compliant with the diet based on plasma PHE levels. Future studies are needed to determine the consequences of non-compliance, whether through decreased intake of medical food protein or increased intake of intact protein, on plasma LNAA concentration and downstream health effects.
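A sketch of the analysis steps described above (group comparison, correlation, and multiple regression), run on synthetic data rather than the study's patient records; the variable names and generated values are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 66
ratio = rng.lognormal(mean=-1.0, sigma=0.8, size=n)   # intact : medical-food protein ratio
age = rng.uniform(4, 50, size=n)
sex = rng.integers(0, 2, size=n)
compliant = rng.random(n) < 0.3                       # synthetic compliance indicator
lnaa = 600 - 20 * ratio + rng.normal(0, 60, size=n)   # synthetic total plasma LNAA, umol/L

# 1) Compare total LNAA between compliant and non-compliant patients (Student t-test).
t, p = stats.ttest_ind(lnaa[compliant], lnaa[~compliant])
print(f"t-test: t = {t:.2f}, p = {p:.3f}")

# 2) Correlation between protein ratio and LNAA within the compliant group.
r, p_r = stats.pearsonr(ratio[compliant], lnaa[compliant])
print(f"Pearson r (compliant) = {r:.3f}, p = {p_r:.3f}")

# 3) Multiple regression of LNAA on ratio, age and sex (ordinary least squares).
X = np.column_stack([np.ones(n), ratio, age, sex])
beta, *_ = np.linalg.lstsq(X, lnaa, rcond=None)
print("OLS coefficients [intercept, ratio, age, sex]:", np.round(beta, 2))
```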
76.
Spatial History: Using Spatial Memory to Recall Information. Logan, Kevin Robert (13 December 2012)
Some computer users employ large displays of six or more monitors in order to view a large amount of data on a single desktop at one time. This layout can be useful when the user is performing tasks in which they must view several different information sources at once. For example, a user writing a paper may simultaneously be typing a document, reading another paper, and viewing a spreadsheet. After the task is completed, the user may close all of the windows; however, sometime later they may want to view a document associated with that task. A possible scenario is for the user to know that they were viewing an important document in their top-left monitor, but they cannot remember which document. SpatialHistory aims to let a user recall spatially which windows and documents were open at a certain time. The user may query a particular region of a large display and SpatialHistory will report the windows that were open in that area. Through a user study, we conclude that i) some users organize their large displays in a spatial manner, placing certain types of documents and windows in certain places, and that ii) our tool has the potential to help users recall previously viewed windows based on a spatial memory of their desktop. / Master of Science
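A sketch of the kind of region-and-time query SpatialHistory is described as supporting; the record fields and example windows are illustrative assumptions, not the tool's actual data model:

```python
from dataclasses import dataclass

@dataclass
class WindowRecord:
    title: str
    x: int
    y: int
    w: int
    h: int           # window bounds on the large display, in pixels
    t_open: float
    t_close: float   # timestamps, seconds since session start

def overlaps(rec: WindowRecord, qx: int, qy: int, qw: int, qh: int) -> bool:
    """Axis-aligned rectangle overlap test between a record and the query region."""
    return not (rec.x + rec.w <= qx or qx + qw <= rec.x or
                rec.y + rec.h <= qy or qy + qh <= rec.y)

def query_region(history, qx, qy, qw, qh, t_from=0.0, t_to=float("inf")):
    """Return titles of windows that were open in the given region and time span."""
    return [r.title for r in history
            if overlaps(r, qx, qy, qw, qh) and r.t_open <= t_to and r.t_close >= t_from]

history = [
    WindowRecord("draft.docx",  0,    0, 1920, 1080, 0,    3600),
    WindowRecord("budget.xlsx", 3840, 0, 1920, 1080, 0,    1800),
    WindowRecord("paper.pdf",   0,    0, 1920, 1080, 1200, 2400),
]
# "What was open in my top-left monitor during the first half hour?"
print(query_region(history, 0, 0, 1920, 1080, t_from=0, t_to=1800))
```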
77.
A Novel Two-Stage Adaptive Method for Estimating Large Covariance and Precision Matrices. Rajendran, Rajanikanth (08 1900)
Estimating large covariance and precision (inverse covariance) matrices has become increasingly important in high-dimensional statistics because of its wide applications. The estimation problem is challenging not only theoretically, due to the positive-definiteness constraint, but also computationally, because of the curse of dimensionality. Many types of estimators have been proposed, such as thresholding under a sparsity assumption on the target matrix, and banding or tapering of the sample covariance matrix. However, these estimators are not always guaranteed to be positive definite, especially for finite samples, and the sparsity assumption is rather restrictive. We propose a novel two-stage adaptive method based on the Cholesky decomposition of a general covariance matrix. By banding the precision matrix in the first stage and adapting the estimates in the second-stage estimation, we develop a computationally efficient and statistically accurate method for estimating high-dimensional precision matrices. We demonstrate the finite-sample performance of the proposed method through simulations from autoregressive, moving average, and long-range dependent processes. We illustrate its wide applicability by analyzing financial data such as S&P 500 index and IBM stock returns, and electric power consumption of individual households. The theoretical properties of the proposed method are also investigated within a large class of covariance matrices.
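A sketch of the banded modified-Cholesky idea that underlies this kind of estimator: regress each variable on its k nearest predecessors, which yields a precision estimate that is positive definite by construction. The band width, the AR(1) test data, and the single-stage banding are simplifying assumptions; the two-stage adaptive selection described in the thesis is not reproduced here:

```python
import numpy as np

def banded_cholesky_precision(X, k):
    """Banded modified-Cholesky precision estimate: Omega_hat = T' D^{-1} T."""
    n, p = X.shape
    X = X - X.mean(axis=0)
    T = np.eye(p)
    d = np.empty(p)
    d[0] = X[:, 0].var()
    for j in range(1, p):
        lo = max(0, j - k)
        Z = X[:, lo:j]                                  # k nearest predecessors
        phi, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
        T[j, lo:j] = -phi
        d[j] = (X[:, j] - Z @ phi).var()                # residual variance
    return T.T @ np.diag(1.0 / d) @ T

# AR(1) example: the true precision matrix is tridiagonal, so a small band suffices.
rng = np.random.default_rng(0)
p, n, rho = 30, 200, 0.6
X = np.empty((n, p))
X[:, 0] = rng.normal(size=n)
for j in range(1, p):
    X[:, j] = rho * X[:, j - 1] + np.sqrt(1 - rho**2) * rng.normal(size=n)

Omega_hat = banded_cholesky_precision(X, k=2)
print("min eigenvalue (positive definite):", np.linalg.eigvalsh(Omega_hat).min().round(4))
```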
78.
Large dam development and displacement: understanding reasons and dynamics for conflict over construction of Polihali dam in Mokhotlong District. Lehema, Reitumetse Elizabeth (11 1900)
This dissertation explores the reasons behind the impasse over the implementation of the LHWP II between communities affected by the construction of the Polihali Dam in Mokhotlong District in Lesotho and the Lesotho Highlands Development Authority (LHDA). This conflict, which has threatened the implementation of the project, has usually been viewed in terms of disagreements over the compensation package for the displacement and losses. Using an actor-oriented approach, the dissertation contends that explanations of the impasse should go further than simplistic accounts of compensation and non-compensation and should focus instead on the dynamics of interaction between the different actors involved in the Polihali case. The investigation of the particular context of the Phase II implementation reveals a complex arena in which different actors, pursuing different agendas and using the power at their disposal to achieve them, eventually struggled to find common ground.
It also reveals a need to incorporate the perspectives of these actors into any explanation of the impasse. Using this approach, the investigation identifies several factors, of which the compensation package is only one, that contributed to the conflict. Chief among these factors was how the LHDA handled the situation. Dislocations are traumatic experiences and require that those affected be fully involved in decisions that affect their situation, yet the LHDA's engagement failed on basic participation principles. The authority modelled its approach to community engagement on tokenism: it allowed consultation of the people but disregarded their inputs in the final decision-making. This resulted in a skewed understanding of the affected society and of the losses resulting from dam construction, which caused tensions between the authority and the communities.
Therefore, this dissertation concludes that, while compensation was certainly a significant aspect of the impasse, the gap between what was compensated and what was expected reflects a general lack of knowledge of these societies and of the social dynamics guiding life in them, which arose from the failure of the approach adopted by the LHDA. The study therefore advocates a search for a new rural development narrative that moves away from assumptions about the viability of small family farms. It also reveals a need to incorporate people affected by large dam development in all aspects of the project that affect their lives. / Dissertation (MSocSci (Development Studies))--University of Pretoria, 2020. / Anthropology and Archaeology / MSocSci (Development Studies) / Unrestricted
79.
SIMULATION OF LARGE STRAIN BEHAVIOUR OF ZIRCONIUM-2 ALLOY. Fu, Yue (January 2016)
Zirconium is a widely applied material in the nuclear industry and is characterized by highly anisotropic mechanical behaviour. With the development of the nuclear industry, zirconium alloys have proven to be very important structural materials, used as pressure tubes in heavy water reactors and as thin-walled tubing in light water reactors owing to their improved corrosion resistance. To study these special properties of zirconium, this thesis evaluates different numerical models and presents an overview of the deformation mechanisms of zirconium alloys.
The Twinning and De-twinning (TDT) and Predominant Twin Reorientation (PTR) models are both implemented in the EVPSC polycrystal model. Various self-consistent schemes, as well as the TDT and PTR twinning models, are evaluated by studying the large strain behaviour of a zirconium alloy slab under different deformation processes. The material parameters for those models are determined from experimental uniaxial tension and compression along the normal direction (ND) as well as tension along the rolling direction (RD). The predictive capability of the models is assessed through comparison of the macroscopic deformation behaviour (stress-strain curves and R values) and microscopic mechanical response (texture coefficients and lattice strains) between predictions and experimental data. Numerical results reveal that, among the various models examined, the VPSC-TDT model with the Affine self-consistent scheme gives the best overall performance for the zirconium alloy slab, and the TDT twinning model performs better overall than the PTR twinning model. / Thesis / Master of Applied Science (MASc)
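For reference, the R value used in the model-experiment comparisons is the Lankford coefficient, the ratio of width strain to thickness strain in a uniaxial tension test; the strain values below are illustrative, not measured zirconium data:

```python
def r_value(eps_width, eps_thickness):
    """Lankford coefficient: width strain over thickness strain in uniaxial tension."""
    return eps_width / eps_thickness

# With plastic incompressibility, eps_thickness = -(eps_axial + eps_width).
eps_axial, eps_width = 0.10, -0.03          # illustrative strain values
eps_thickness = -(eps_axial + eps_width)
print(f"R = {r_value(eps_width, eps_thickness):.2f}")
```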
80.
PEPdb Construction and Large-Scale Analysis of ESTs. Shen, Ling (07 1900)
The Protist EST Program (PEP) aims to explore the diversity of eukaryotic genomes in a systematic and comprehensive way. A main element of the PEP initiative is to establish a database, the Protist EST Database (PEPdb), which is the centerpiece of the PEP collaboration. The major functions of the PEPdb are management of the data generated by PEP, analysis of these data, and provision of access to the collected sequence information over the Internet for PEP members and other users. In this project, a consistent and easy-to-use relational database was implemented. All information about PEP members, publications, libraries and ESTs can be stored in the database system, and the operations are carried out through a friendly user interface. The database stores about 10,000 records and is available at "http://info.biology.mcmaster.ca/ling/estHome.html" for demonstration.

An analysis of ESTs from the ciliated protozoan Tetrahymena thermophila was undertaken. A total of 3740 non-redundant gene assemblies and singletons from TIGR were analyzed. These sequences were compared against the NCBI non-redundant protein and nucleotide databases using BLASTX and BLASTN to identify putative genes. Of 850 highly significant matches with an expect value cut-off of 10^-20, 35.5% represent genes previously cloned from T. thermophila, and 64.5% had significant similarity to genes from other organisms deposited in the NCBI databases. There are 26 sequences (3.1%) that matched signal transduction proteins, including Rac, Ras, MAPK, ERK1, PKC, cAMP and 14-3-3 (a protein involved in signal transduction, exocytosis and cell cycle regulation). This result indicates that T. thermophila likely encodes the MAPK/ERK signaling pathway. About 53 sequences (6.2%) matched cytoskeletal proteins, which were divided into two groups. The first group matched genes coding for microtubule proteins, especially tubulin genes. The other group matched microfilament genes, including one actin, three actin-related proteins and one profilin. There were no sequences similar to intermediate filaments. EST counts for individual genes provide estimates of mRNA expression levels; the most abundant genes represented are enolase, SerH3 and tubulin. Among the 850 highly significant similarities, 196 were restricted to the Ciliophora; GRL and SerH are ciliate-specific genes. There were 508 sequences that had highly significant matches (expect value < 10^-20) to human genes. Approximately 189 of them were present in humans but not found in the completely sequenced Saccharomyces cerevisiae. Based on Venn diagram analysis, T. thermophila contains abundant eukaryote-specific proteins and many prokaryote-like genes, and some metabolic enzymes in T. thermophila are also present in plants. These results support the view that T. thermophila is an excellent unicellular model system for gene discovery and functional analysis. / Thesis / Master of Science (MS)
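A sketch of the filtering and tallying step described above (expect-value cutoff of 10^-20 and grouping of hits into functional categories); the file name, the standard BLAST tabular column layout (-outfmt 6), and the keyword lists are assumptions:

```python
import csv
from collections import Counter

EVALUE_CUTOFF = 1e-20
CATEGORIES = {
    "signal transduction": ["ras", "rac", "mapk", "erk", "pkc", "14-3-3"],
    "cytoskeleton": ["tubulin", "actin", "profilin"],
}

def categorize(description: str) -> str:
    """Assign a hit to a functional category by keyword match on its description."""
    desc = description.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in desc for k in keywords):
            return category
    return "other"

counts = Counter()
with open("tetrahymena_blastx.tsv") as fh:       # hypothetical file of tabular BLAST hits
    for row in csv.reader(fh, delimiter="\t"):
        # Standard tabular columns: qseqid, sseqid, ..., evalue, bitscore.
        # Here the subject field is assumed to carry a usable description;
        # in practice a separate lookup of the subject annotation may be needed.
        query, subject, *rest, evalue, bitscore = row
        if float(evalue) <= EVALUE_CUTOFF:
            counts[categorize(subject)] += 1

for category, n in counts.most_common():
    print(f"{category:20s} {n}")
```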