1 
Heuristic Algorithms for Agnostically Identifying the Globally Stable and Competitive Metastable Morphologies of Block Copolymer Melts
Tsai, Carol Leanne (07 March 2019)
<p> Block copolymers are composed of chemically distinct polymer chains that can be covalently linked in a variety of sequences and architectures. They are ubiquitous as ingredients of consumer products and also have applications in advanced plastics, drug delivery, advanced membranes, and next-generation nanolithographic patterning. The wide spectrum of possible block copolymer applications is a consequence of block copolymer self-assembly into periodic, mesoscale morphologies as a function of varying block composition and architecture in both melt and solution states, and of the broad spectrum of physical properties that such mesophases afford. </p><p> Materials exploration and discovery have traditionally been pursued through an iterative process between experimental and theoretical/computational collaborations. This process is often implemented in a trial-and-error fashion, and from the computational perspective of generating phase diagrams, it usually requires some existing knowledge about the competitive phases for a given system. Self-Consistent Field Theory (SCFT) simulations have proven to be both qualitatively and quantitatively accurate in the determination, or forward mapping, of the block copolymer phases of a given system. However, it is possible to miss candidates, because SCFT simulations are highly dependent on their initial configurations, and mapping phase diagrams requires a priori knowledge of the competing candidate morphologies. The unguided search for the stable phase of a block copolymer of a given composition and architecture is a problem of global optimization. SCFT by itself is a local optimization method, so we can combine it with population-based heuristic algorithms geared toward global optimization to facilitate forward mapping. In this dissertation, we discuss the development of two such methods: Genetic Algorithm + SCFT (GA-SCFT) and Particle Swarm Optimization + SCFT (PSO-SCFT). 
Both methods allow a population of configurations to explore the space associated with the numerous states accessible to a block copolymer of a given composition and architecture. </p><p> GA-SCFT is a real-space method in which a population of SCFT field configurations “evolves” over time. This is achieved by initializing the population randomly, allowing the configurations to relax to local basins of attraction using SCFT simulations, then selecting fit members (lower free-energy structures) to recombine their fields and undergo mutations to generate a new “generation” of structures that iterates through this process. We present results from benchmark testing of this GA-SCFT technique on the canonical AB diblock copolymer melt, for which the theoretical phase diagram has long been established. The GA-SCFT algorithm successfully predicts many of the conventional mesophases from random initial conditions in large, three-dimensional simulation cells, including hexagonally packed cylinders, BCC-packed spheres, and lamellae, over a broad composition range and from weak to moderate segregation strength. However, the GA-SCFT method is currently not effective at discovering network phases, such as the Double-Gyroid (GYR) structure. </p><p> PSO-SCFT is a reciprocal-space approach in which Fourier components of SCFT fields near the principal shell are manipulated. Effectively, PSO-SCFT facilitates the search through a space of reciprocal-space SCFT seeds which yield a variety of morphologies. Using intensive free energy as a fitness metric by which to compare these morphologies, the PSO-SCFT methodology allows us to agnostically identify low-lying competitive and stable morphologies. We present results from applying PSO-SCFT to conformationally symmetric diblock copolymers and a miktoarm star polymer, AB<sub>4</sub>, which offers a rich variety of competing sphere structures. 
Unlike the GA-SCFT method presented above, PSO-SCFT successfully predicts the double-gyroid morphology in the AB diblock. Furthermore, PSO-SCFT successfully recovers the A<sub>15</sub> morphology at a composition where it is expected to be stable in the miktoarm system, as well as several competitive metastable candidates and a new sphere morphology belonging to hexagonal space group 191, which has not been seen before in polymer systems. Thus, we believe the PSO-SCFT method provides a promising platform for screening for competitive structures in a given block copolymer system.</p>
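The particle-swarm search at the heart of PSO-SCFT can be illustrated with a generic sketch. The code below is a minimal, stdlib-only particle swarm optimizer, not the authors' reciprocal-space SCFT implementation; the quadratic test function stands in for the intensive free energy, and all parameter values (inertia `w`, acceleration coefficients `c1`, `c2`) are illustrative assumptions.

```python
import random

def pso(fitness, dim, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimizer: each particle remembers its own best
    position, and the swarm tracks the global best (fittest) position."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Toy "free energy": a shifted quadratic with its minimum at (1, 2).
best, best_f = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2, dim=2)
```

In PSO-SCFT the particle position would be a vector of Fourier coefficients near the principal shell, and the fitness the intensive free energy returned by an SCFT relaxation of that seed.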

2 
Molecular Dynamics Study of Polymers and Atomic Clusters
Sponseller, Daniel Ray (23 March 2018)
<p> This dissertation contains investigations, based on Molecular Dynamics (MD), of a variety of systems, from small atomic clusters to polymers in solution and in their condensed phases. The overall research is divided into three parts. First, I tested a thermostat recently proposed in the literature on the thermal equilibration of a small cluster of Lennard-Jones (LJ) atoms. The proposed thermostat is a Hamiltonian thermostat based on a logarithmic oscillator with the outstanding property that the mean value of its kinetic energy is constant, independent of its mass and energy. I inspected several weak-coupling interaction models between the LJ cluster and the logarithmic oscillator in 3D. In all cases I show that this coupling gives rise to kinetic motion of the cluster center of mass without transferring kinetic energy to the interatomic vibrations. This is a failure of the published thermostat, because the temperature of a small atomic cluster is mainly due to vibrations. This logarithmic oscillator therefore cannot be used to thermostat any atomic or molecular system, small or large. </p><p> The second part of the dissertation is an investigation of the inherent structure of the polymer polyethylene glycol (PEG) solvated in three different solvents: water, water with 4% ethanol, and ethyl acetate. PEG with a molecular weight of 2000 Da (PEG<sub>2000</sub>) is a polymer with many applications, from industrial manufacturing to medicine, that in bulk is a paste. However, its structure in very dilute solutions deserved a thorough study, important for the onset of aggregation with other polymer chains. I introduced a modification to the GROMOS 54A7 force field parameters for modeling PEG<sub>2000</sub> and ethyl acetate. Both force fields are new and have now been incorporated into the database of known residues in the molecular dynamics package Gromacs. 
This research required numerous high-performance-computing MD simulations on the ARGO cluster at GMU for systems with about 100,000 solvent molecules. My findings show that PEG<sub>2000</sub> in water acquires a ball-like structure without encapsulating solvent molecules. In addition, no hydrogen bonds were formed. In water with 4% ethanol, PEG<sub>2000</sub> also acquires a ball-like structure, but the polymer ends fluctuate, folding outward and onward, although the general shape is still a compact ball-like structure. </p><p> In contrast, PEG<sub>2000</sub> in ethyl acetate is quite elongated, like very flexible spaghetti that forms kinks that unfold to give rise to folds and kinks at other positions along the polymer length. The behavior resembles an ideal polymer in a θ solvent. A Principal Component Analysis (PCA) of the minima composing the inherent structure evidences the presence of two distinct groups of ball-like structures of PEG<sub>2000</sub> in water and in water with 4% ethanol. These groups give a definite signature to the solvated structure of PEG<sub>2000</sub> in these two solvents. In contrast, PCA reveals several groups of avoided states for PEG<sub>2000</sub> in ethyl acetate that rule out the possibility of its being an ideal polymer in a θ solvent. </p><p> The third part of the dissertation is a work in progress, in which I investigate the condensed phase of PEG<sub>2000</sub> and study the interface between the condensed phase and the three different solvents under study. With a strategy of combining NPT MD simulations at different temperatures and pressures, the PEG<sub>2000</sub> condensed phase displays the experimental density to within 1% at 300 K and 1 atm. This is a very encouraging result for this ongoing project. </p>
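The kind of PCA used to classify inherent-structure minima can be sketched generically. The snippet below runs PCA via the covariance eigendecomposition on synthetic stand-in data (two Gaussian clusters in a hypothetical 10-dimensional coordinate space), not on the actual Gromacs trajectories; the cluster parameters are illustrative assumptions.

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via eigendecomposition of the covariance matrix. Rows of X are
    observations (e.g. one energy-minimized conformation per frame),
    columns are coordinates."""
    Xc = X - X.mean(axis=0)                # center each coordinate
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    order = np.argsort(evals)[::-1]        # re-sort descending
    evals, evecs = evals[order], evecs[:, order]
    scores = Xc @ evecs[:, :n_components]  # projections onto the leading PCs
    explained = evals[:n_components] / evals.sum()
    return scores, explained

# Synthetic stand-in for two groups of minima: two well-separated
# Gaussian blobs in a 10-dimensional coordinate space.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 0.1, size=(50, 10))
b = rng.normal(0.0, 0.1, size=(50, 10)) + np.array([2.0] + [0.0] * 9)
scores, explained = pca(np.vstack([a, b]))
```

The first principal component then separates the two groups, which is the kind of signature the dissertation reports for the ball-like structures in water and water/ethanol.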

3 
Systematic parameterized complexity analysis in computational phonology
Wareham, Harold (20 November 2017)
Many computational problems are NP-hard and hence probably do not have fast, i.e., polynomial-time, algorithms. Such problems may yet have non-polynomial-time algorithms whose time complexities are functions of particular aspects of the problem, i.e., the algorithm's running time is upper-bounded by f(k) · xᶜ, where f is an arbitrary function, x is the size of the input, k is an aspect of the problem, and c is a constant independent of x and k. Given such algorithms, it may still be possible to obtain optimal solutions for large instances of NP-hard problems for which the appropriate aspects are of small size or value. Questions about the existence of such algorithms are most naturally addressed within the theory of parameterized computational complexity developed by Downey and Fellows.
This thesis considers the merits of a systematic parameterized complexity analysis in which results are derived relative to all subsets of a specified set of aspects of a given NP-hard problem. This set of results defines an “intractability map” that shows relative to which sets of aspects algorithms whose non-polynomial time complexities are purely functions of those aspects do and do not exist for that problem. Such maps are useful not only for delimiting the set of possible algorithms for an NP-hard problem but also for highlighting those aspects that are responsible for this NP-hardness.
These points will be illustrated by systematic parameterized complexity analyses of problems associated with five theories of phonological processing in natural languages—namely, Simplified Segmental Grammars, finite-state-transducer-based rule systems, the KIMMO system, Declarative Phonology, and Optimality Theory. The aspects studied in these analyses broadly characterize the representations and mechanisms used by these theories. These analyses suggest that the computational complexity of phonological processing depends not on such details as whether a theory uses rules or constraints, or has one, two, or many levels of representation, but rather on the structure of the representation-relations encoded in individual mechanisms and the internal structure of the representations. / Graduate
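The f(k) · xᶜ running-time form has a standard textbook illustration outside phonology: the bounded-search-tree algorithm for k-Vertex Cover, which runs in O(2ᵏ · m) time, so the exponential cost depends only on the parameter k, not on the input size. A minimal sketch on a hypothetical example graph (not one of the phonology problems analyzed in the thesis):

```python
def vertex_cover(edges, k):
    """Bounded search tree for k-Vertex Cover. Some endpoint of any edge
    must be in every cover, so branch on the two choices to depth k:
    O(2^k * m) time. Returns a cover of size <= k, or None if none exists."""
    if not edges:
        return set()          # every edge is covered
    if k == 0:
        return None           # edges remain but no budget left
    u, v = edges[0]           # an uncovered edge: branch on its endpoints
    for w in (u, v):
        rest = [e for e in edges if w not in e]  # edges w does not cover
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {w}
    return None

# Path graph a-b-c-d: covered by two vertices, but not by one.
path = [("a", "b"), ("b", "c"), ("c", "d")]
```

The point of the parameterized view is exactly this shape: for small k the algorithm is fast even on large graphs, and an intractability map records for which aspect sets such algorithms exist.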

4 
Optimal Multi-Time-Period Gasoline Blending
Kulkarni, Shefali (08 1900)
<p>Multi-time-period gasoline blending is an example of a multipurpose production system designed to produce multiple products by switching from one product to another. Various factors, such as the demand for gasoline, the availability of supply components, and the blend recipes, vary with time. The task of the gasoline blender is to decide how much of each product to produce at what point in time (lot sizing) and what the blend recipe should be in order to minimize overall cost (optimize the blend recipe). The production plans need to account for setup times between blends and to minimize switching between different product blends. Traditional optimization techniques provide a single optimal solution. This research uses an evolutionary optimization algorithm called differential evolution to identify multiple solutions that all have the same total cost but offer the blend planner multiple choices in terms of how much of a given product to blend at what point in time.</p> / Master of Applied Science (MASc)
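The differential evolution algorithm mentioned above can be sketched in its standard DE/rand/1/bin form. This is a generic, stdlib-only sketch, not the thesis's blending model: the toy cost function (deviation of a three-component recipe from a hypothetical target mix) and all control parameters (F, CR, population size) are illustrative assumptions.

```python
import random

def differential_evolution(cost, dim, bounds=(0.0, 1.0), pop_size=30,
                           F=0.8, CR=0.9, iters=300, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference of two random
    members, crossover with the current member, keep the trial vector only
    if it does not worsen the cost."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    costs = [cost(p) for p in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)   # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    x = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(hi, max(lo, x)))  # clip to bounds
                else:
                    trial.append(pop[i][j])
            f = cost(trial)
            if f <= costs[i]:            # greedy one-to-one selection
                pop[i], costs[i] = trial, f
    best = min(range(pop_size), key=lambda i: costs[i])
    return pop[best], costs[best]

# Toy blending cost: squared deviation from a hypothetical target recipe.
target = [0.5, 0.3, 0.2]
recipe, c = differential_evolution(
    lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target)), dim=3)
```

Because the whole population survives to the end, the final population can contain several distinct recipes of (near-)equal cost, which is the property the thesis exploits to offer the blend planner multiple choices.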

5 
Controlling the Dual Cascade of Two-dimensional Turbulence
Farazmand, Mohammad M. (04 1900)
<p>The Kraichnan-Leith-Batchelor (KLB) theory of statistically stationary, forced, homogeneous, isotropic 2D turbulence predicts the existence of two inertial ranges: an energy inertial range with an energy spectrum scaling of k<sup>−5/3</sup>, and an enstrophy inertial range with an energy spectrum scaling of k<sup>−3</sup>. However, unlike the analogous Kolmogorov theory for 3D turbulence, the scaling of the enstrophy range in 2D turbulence seems to be Reynolds-number dependent: numerical simulations have shown that as the Reynolds number tends to infinity, the enstrophy range of the energy spectrum converges to the KLB prediction, i.e. E ~ k<sup>−3</sup>.</p> <p>We develop an adjoint-equation-based optimal control approach for controlling the energy spectrum of incompressible fluid flow. The equations are solved numerically by a highly accurate method. The computations are carried out on parallel computers in order to achieve a reasonable computational time.</p> <p>The results show that the space-time structure of the forcing can significantly alter the scaling of the energy spectrum over the inertial ranges. This effect has been neglected in most previous numerical simulations, which use a random-phase forcing. A careful analysis of the resulting forcing suggests that it is unlikely to be realized in nature, or by a simple numerical model. Therefore, we conjecture that the dual cascade is unlikely to be realizable at moderate Reynolds numbers without resorting to forcings that depend on the instantaneous flow structure or are not band-limited.</p> / Master of Applied Science (MASc)
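The energy spectrum E(k) referred to above is computed from a doubly periodic velocity field by binning the spectral kinetic energy into shells of constant |k|. The sketch below is a minimal generic implementation, exercised on a hypothetical single-mode test field, not on output of the controlled simulations.

```python
import numpy as np

def energy_spectrum(u, v):
    """Isotropic energy spectrum E(k) of a doubly periodic 2D velocity field
    on an n x n grid: sum the spectral kinetic energy density of each Fourier
    mode into integer-|k| shells."""
    n = u.shape[0]
    uh = np.fft.fft2(u) / n**2
    vh = np.fft.fft2(v) / n**2
    e = 0.5 * (np.abs(uh) ** 2 + np.abs(vh) ** 2)   # energy per mode
    k = np.fft.fftfreq(n, d=1.0 / n)                # integer wavenumbers
    kmag = np.sqrt(k[:, None] ** 2 + k[None, :] ** 2)
    shells = np.rint(kmag).astype(int)
    return np.bincount(shells.ravel(), weights=e.ravel())  # E[k] per shell

# Single-mode test field u = cos(4x), v = 0 on a 64 x 64 grid: all the
# kinetic energy (mean u^2 / 2 = 1/4) should land in the k = 4 shell.
n = 64
x = 2 * np.pi * np.arange(n) / n
u = np.cos(4 * x)[None, :].repeat(n, axis=0)
v = np.zeros((n, n))
E = energy_spectrum(u, v)
```

In a simulation of the dual cascade, the slopes of log E(k) against log k over the two inertial ranges would then be compared with the −5/3 and −3 KLB predictions.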

6 
Single Machine Total Weighted Tardiness With Release Dates
Jing, Wei (12 1900)
<p>The single-machine total weighted tardiness problem with release dates is known to be strongly NP-hard. With a new lower-bounding scheme and a new upper-bounding scheme, we obtain an efficient branch-and-bound algorithm. In this thesis, we first introduce the history of the problem and its computational complexity. Second, the lower-bounding and upper-bounding schemes are described in detail. We also present all the dominance rules used in the branch-and-bound algorithm to solve the problem.</p> <p>In the part on dominance rules, we describe the labeling scheme and suggest a data structure for a dominance rule.</p> <p>Finally, we implement the branch-and-bound algorithm in C++ with all the techniques introduced above, and we present numerical results produced by the program. Using the same instance-generation scheme and the test instances from Dr. Jouglet, our results show that this branch-and-bound method outperforms previous approaches specialized for the problem.</p> / Master of Science (MS)
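The overall branch-and-bound structure can be sketched generically. The toy solver below enumerates job sequences depth-first and prunes with only the trivial bound that accumulated weighted tardiness never decreases as a prefix is extended; the thesis's actual lower-bounding, upper-bounding, and dominance rules are far stronger. The three-job instance is hypothetical.

```python
def weighted_tardiness(order, jobs):
    """Total weighted tardiness of sequencing `order`, respecting release
    dates. jobs[j] = (release, processing, due, weight)."""
    t, total = 0, 0
    for j in order:
        r, p, d, w = jobs[j]
        t = max(t, r) + p               # cannot start before the release date
        total += w * max(0, t - d)
    return total

def branch_and_bound(jobs):
    """Depth-first branch and bound over job sequences, pruning any prefix
    whose cost already reaches the incumbent (tardiness only accumulates)."""
    n = len(jobs)
    best = [weighted_tardiness(range(n), jobs)]   # initial upper bound
    best_seq = [list(range(n))]

    def extend(prefix, remaining, t, cost):
        if cost >= best[0]:
            return                       # prune: prefix already too costly
        if not remaining:
            best[0], best_seq[0] = cost, prefix
            return
        for j in sorted(remaining):
            r, p, d, w = jobs[j]
            ct = max(t, r) + p
            extend(prefix + [j], remaining - {j}, ct,
                   cost + w * max(0, ct - d))

    extend([], set(range(n)), 0, 0)
    return best_seq[0], best[0]

# Hypothetical instance: three jobs as (release, processing, due, weight).
jobs = [(0, 3, 3, 2), (1, 2, 4, 3), (0, 2, 6, 1)]
seq, cost = branch_and_bound(jobs)
```

Stronger lower bounds and dominance rules of the kind developed in the thesis prune far more of this search tree, which is what makes exact solution of larger instances feasible.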

7 
Rapid Reoptimization of Prostate Intensity-Modulated Radiation Therapy Using Regularized Linear Programming
Khalajipirbalouti, Maryam (04 1900)
<p>This thesis presents a new linear programming approach for reoptimizing an intensity-modulated radiation therapy (IMRT) treatment plan in order to compensate for interfraction tissue deformations. Different formulations of the problem involve different constraints, but a common constraint that is difficult to handle mathematically is that the dose be deliverable using a small number of multileaf collimator (MLC) positions. MLC leaves are tungsten-alloy attenuators that can be moved in and out to shape the radiation aperture. Since the leaves are solid, photon fluence profiles follow a staircase function, and this constraint is nonconvex and difficult to formulate. In this thesis, we propose relaxing this constraint to a bound on the ℓ₁-norm of the differences between adjacent radiation fluxes. With the appropriate bound, this constraint encourages the dose to be deliverable with a series of shrinking or growing openings between the leaves. Such a solution can be made realizable by rounding, which is beyond the scope of this thesis. The approach has been tested on an anonymized prostate cancer treatment plan with simulated deformations. Without rounding, solutions were obtained in five of nine cases, in less than five seconds of computation on a NEOS server. The solved cases demonstrated excellent target coverage (the minimum dose in the target was 95% of the prescribed dose) and organ sparing (the mean dose in normal tissues was below 25% of the prescribed dose).</p> / Master of Science (MSc)
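The ℓ₁ relaxation rests on a simple observation: for a nonnegative fluence row that starts and ends closed (zero), the total variation Σᵢ |xᵢ₊₁ − xᵢ| is at least twice the peak value, with equality exactly when the profile is unimodal, i.e., deliverable as a single opening that grows and then shrinks (in an LP, each |xᵢ₊₁ − xᵢ| is linearized with an auxiliary variable sᵢ ≥ ±(xᵢ₊₁ − xᵢ)). A small stdlib sketch with hypothetical, non-clinical profiles:

```python
def total_variation(x):
    """l1-norm of adjacent fluence differences, padding both ends with
    zero to model a fully closed leaf pair at the field edges."""
    padded = [0.0] + list(x) + [0.0]
    return sum(abs(b - a) for a, b in zip(padded, padded[1:]))

# A unimodal profile: deliverable as one opening that grows, then shrinks.
unimodal = [1.0, 2.0, 3.0, 2.0, 1.0]
# Same peak value, but with a dip: needs two separate openings.
two_peaks = [1.0, 3.0, 1.0, 3.0, 1.0]
```

Bounding the total variation near twice the peak therefore steers the LP toward unimodal, MLC-friendly profiles, which is the effect the thesis reports.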

8 
Bandwidth Minimization for Sparse Matrices
Goyal, Virendra K. (11 1900)
<p>A short survey of recent developments in sparse matrix techniques is presented in this project. One problem in this area is bandwidth reduction. Several algorithms for finding symmetric row and column permutations of a given sparse symmetric matrix, such that the resulting matrix has minimum bandwidth, are discussed. A few modified algorithms yielding better bandwidth reduction are also presented. Six well-known example problems are used to illustrate the work.</p> / Master of Science (MS)
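The abstract does not name the permutation algorithms surveyed, but the classic member of this family is the Cuthill-McKee ordering: a breadth-first traversal that visits neighbours in order of increasing degree, so that connected rows end up close together in the permuted matrix. A minimal sketch on a hypothetical path graph (the graph of the matrix's off-diagonal nonzeros):

```python
from collections import deque

def bandwidth(adj, order):
    """Half-bandwidth of the symmetric matrix whose off-diagonal nonzeros
    are the edges of `adj`, after symmetrically permuting rows and columns
    into `order`."""
    pos = {v: i for i, v in enumerate(order)}
    return max((abs(pos[u] - pos[v]) for u in adj for v in adj[u]), default=0)

def cuthill_mckee(adj, start):
    """Cuthill-McKee ordering: BFS from a (preferably low-degree) start
    vertex, enqueueing unvisited neighbours by increasing degree."""
    order, seen = [], {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in sorted(adj[u], key=lambda w: len(adj[w])):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return order

# A path graph 0-4-1-3-2 labelled in an interleaved order has bandwidth 4;
# Cuthill-McKee from the endpoint 0 recovers the natural path order.
adj = {0: [4], 4: [0, 1], 1: [4, 3], 3: [1, 2], 2: [3]}
```

Reversing the resulting order (Reverse Cuthill-McKee) is the variant usually preferred in practice because it tends to reduce fill-in during factorization as well.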

9 
CYSDEM User's Manual
Naguib, Mohamed (05 1900)
<p>CYSDEM is a new FORTRAN-based simulation language used to simulate business information systems. The main feature of CYSDEM is its focus on the delays and distortions to which information in actual systems is subjected. CYSDEM therefore allows a more realistic representation of the movement of data within an information system.</p> <p>CYSDEM is based on a schematic representation of the information system under consideration, called the CYBERSTRUCTURE. The CYBERSTRUCTURE is constructed by the user and helps the user identify relevant information.</p> <p>Any new computer language requires careful verification and complete documentation. This report contains an introduction to simulation and a user's guide. Partial verification of CYSDEM was performed using a distribution-system example from Forrester's "Industrial Dynamics". The example is explained and the results are included in this report.</p> / Master of Science (MS)

10 
An Algorithm for the Solution of Zero-One Resource Allocation Problems
Mowla, Golam (09 1900)
<p>An algorithm is developed for the discrete optimization of zero-one resource allocation problems. A single-constraint problem is first formulated as a dynamic program. This formulation then undergoes a number of modifications to develop the algorithm, which leads to a significant reduction in computational requirements compared to the dynamic programming method. Three theorems and several lemmas, central to making the algorithm efficient, are proved. The study also includes the features needed to extend the algorithm to problems with more than one constraint.</p> / Master of Science (MS)
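The single-constraint zero-one allocation problem is the 0/1 knapsack, and the baseline dynamic programming formulation that the thesis starts from (and then improves on) can be sketched briefly; the item data below are hypothetical.

```python
def knapsack(values, costs, budget):
    """Single-constraint zero-one allocation by dynamic programming:
    best[b] = maximum value achievable with total resource cost <= b.
    O(n * budget) time, O(budget) space."""
    best = [0] * (budget + 1)
    for v, c in zip(values, costs):
        # iterate budgets downward so each item is selected at most once
        for b in range(budget, c - 1, -1):
            best[b] = max(best[b], best[b - c] + v)
    return best[budget]

# Four candidate allocations as (value, resource cost), budget 10.
values = [10, 40, 30, 50]
costs = [5, 4, 6, 3]
```

The thesis's modifications aim precisely at cutting down the work this O(n · budget) table implies, using dominance-style results (its theorems and lemmas) to skip states that cannot lead to an optimum.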
