141

Learning symmetry-preserving interatomic force fields for atomistic simulations

Batzner, Simon Lutz. January 2019
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2019 / Cataloged from PDF version of thesis. / Includes bibliographical references. / Machine-learning interatomic force fields have shown great promise in increasing the time- and length-scales accessible to atomistic simulations while retaining the high accuracy of the reference calculations they are trained on. Most proposed models aim to learn the potential energy surface of a system of atoms as a function of atomic coordinates and species, and obtain the forces acting on the atoms as the negative gradient of the global energy with respect to the atomic positions. For the time evolution of an atomistic system in molecular dynamics, however, only atomic forces are required. This thesis examines the construction of a direct approach for learning atomic forces, thereby bypassing the need for learning an energy-based model. Predicting atomic forces directly requires careful incorporation of the symmetries of 3D space into the model. The construction of an efficient, direct, and symmetry-preserving deep learning model that predicts atomic forces in a fully end-to-end fashion is shown. The model's accuracy and its computational efficiency, both during training and at prediction time, are evaluated. Finally, the approach is used to simulate several small organic molecules and the resulting molecular dynamics simulations are analyzed. / by Simon Lutz Batzner. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
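For reference, the energy-based baseline that the thesis contrasts against can be written as F_i = -dE/dr_i: forces are the negative gradient of a global energy. A minimal sketch of that baseline, using a Lennard-Jones pair potential and finite-difference gradients purely for illustration (this is not the thesis's learned, symmetry-preserving model):

```python
import numpy as np

def lj_energy(positions, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy of an N x 3 array of atomic positions."""
    n = positions.shape[0]
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            energy += 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return energy

def forces_from_energy(positions, h=1e-5):
    """Forces as the negative gradient of the energy, F_i = -dE/dr_i,
    estimated here by central finite differences."""
    forces = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for d in range(3):
            shifted = positions.copy(); shifted[i, d] += h
            e_plus = lj_energy(shifted)
            shifted[i, d] -= 2 * h
            e_minus = lj_energy(shifted)
            forces[i, d] = -(e_plus - e_minus) / (2 * h)
    return forces

pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [0.0, 1.1, 0.0]])
print(forces_from_energy(pos).round(3))
```

A direct force model, as the thesis proposes, instead predicts the force vectors themselves, which avoids differentiating through an energy but makes preserving rotational symmetry the central design problem.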
142

Analysis of additive manufacturing in an automobile service part supply chain

Wei, Yijin. January 2018
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2018 / Cataloged from PDF version of thesis. / Includes bibliographical references (pages 67-68). / The performance of a traditional supply chain depends on the efficiency of mass production, the availability of productive low-cost labor, and the geometry and materials of the products. Additive manufacturing, on the other hand, bypasses these constraints and reduces the number of stages in the supply chain by allowing local production of low-volume parts of greater complexity. We develop an approach for assessing the total cost when additive manufacturing is integrated into the service-parts supply chain, given a set of inputs that characterize the supply chain. Specifically, we present several simulation and optimization models to help companies decide the end-of-life strategy for low-volume service parts. Through sensitivity analysis, we identify regions of the parameter space where additive manufacturing is preferred. Moreover, we find that service parts with a high lost-sales unit cost and low fixed and variable additive manufacturing costs are the most suitable for additive manufacturing. / by Yijin Wei. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
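The cost comparison described above can be illustrated with a toy simulation of one end-of-life decision: a one-time "final buy" of spares under a traditional supply chain versus printing parts on demand. All parameter names and values below are hypothetical, chosen only to show the structure of the trade-off, not the thesis's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for one end-of-life service part.
annual_demand_rate = 3.0      # parts per year (Poisson)
years = 10
traditional_unit_cost = 40.0  # mass-produced, bought once up front
holding_cost_per_year = 5.0   # per part held in stock
lost_sale_cost = 500.0        # penalty when a stocked-out part is needed
am_fixed_cost = 2000.0        # qualification / setup for printing
am_unit_cost = 120.0          # printed on demand, no inventory

# One shared sample of lifetime demand for a fair comparison.
demand = rng.poisson(annual_demand_rate * years, size=5000)

def traditional_cost(final_buy):
    """Expected cost of a one-time final buy, with lost sales charged
    whenever later demand exceeds the stock."""
    shortage = np.maximum(demand - final_buy, 0)
    return (final_buy * (traditional_unit_cost
                         + holding_cost_per_year * years / 2)
            + lost_sale_cost * shortage.mean())

def am_cost():
    """Expected cost of printing every demanded part on demand."""
    return am_fixed_cost + am_unit_cost * demand.mean()

best = min(range(101), key=traditional_cost)
print(f"best final buy: {best} parts, cost {traditional_cost(best):,.0f}")
print(f"on-demand additive manufacturing cost: {am_cost():,.0f}")
```

Sweeping lost_sale_cost up and the two AM cost parameters down flips the comparison, which mirrors the abstract's qualitative finding about which parts suit additive manufacturing.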
143

Applications of deep learning and computer vision in large scale quantification of tree canopy cover and real-time estimation of street parking

Cai, Bill Yang. January 2018
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2018 / Cataloged from PDF version of thesis. / Includes bibliographical references (pages 73-77). / A modern city generates a large volume of digital information, especially in the form of unstructured image and video data. Recent advancements in deep learning techniques have enabled effective learning and estimation of high-level attributes and meaningful features from large digital datasets of images and videos. In my thesis, I explore the potential of applying deep learning to image and video data to quantify urban tree cover and street parking utilization. Large-scale and accurate quantification of urban tree cover is important for informing government agencies in their public greenery efforts, and useful for modelling and analyzing city ecology and urban heat island effects. We apply state-of-the-art deep learning models and compare their performance to a previously established benchmark based on an unsupervised method. / Our training procedure for deep learning models is novel; we utilize the abundance of openly available and similarly labelled street-level image datasets to pre-train our model. We then perform additional training on a small training dataset consisting of Google Street View (GSV) images. We also employ a recently developed method called gradient-weighted class activation mapping (Grad-CAM) to interpret the features learned by the end-to-end model. The results demonstrate that deep learning models are highly accurate, can be interpretable, and can also be efficient in terms of data-labelling effort and computational resources. Accurate parking quantification would inform developers and municipalities in space allocation and design, while real-time measurements would provide drivers and parking enforcement with information that saves time and resources. We propose an accurate and real-time video system for future Internet of Things (IoT) and smart cities applications. / Using recent developments in deep convolutional neural networks (DCNNs) and a novel intelligent vehicle tracking filter, the proposed system combines information across multiple image frames in a video sequence to remove noise introduced by occlusions and detection failures. We demonstrate that the proposed system achieves higher accuracy than pure image-based instance segmentation, and is comparable in performance to industry benchmark systems that utilize more expensive sensors such as radar. Furthermore, the proposed system can be easily configured for deployment in different parking scenarios, and can provide spatial information beyond traditional binary occupancy statistics. / by Bill Yang Cai. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
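The abstract describes the tracking filter only at a high level. As one illustration of the general idea — combining information across frames to suppress occlusions and detection failures — here is a minimal sliding-window smoother over per-frame occupancy detections; the window size and threshold are hypothetical, and this is not the thesis's actual filter:

```python
from collections import deque

def smooth_occupancy(frame_detections, window=15, threshold=0.6):
    """Temporally smooth noisy per-frame parking-spot detections.

    frame_detections: iterable of bools, True if a vehicle was detected
    in the spot in that frame. The spot is reported occupied only if at
    least `threshold` of the last `window` frames detected a vehicle,
    which suppresses single-frame failures and brief occlusions.
    """
    recent = deque(maxlen=window)
    smoothed = []
    for detected in frame_detections:
        recent.append(detected)
        smoothed.append(sum(recent) / len(recent) >= threshold)
    return smoothed

# Example: a detector that drops out for three frames during an occlusion.
raw = [True] * 20 + [False] * 3 + [True] * 20
print(smooth_occupancy(raw))   # stays True through the dropout
```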
144

A computational approach to urban economics

Chong, Shi Kai. January 2018
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, 2018 / Cataloged from PDF version of thesis. / Includes bibliographical references (pages 89-92). / Cities are home to more than half of the world's population today, and urbanization is one of this century's biggest drivers of global economic growth. The dynamics of the urban environment are thus an important question to investigate. In this thesis, techniques from statistical modeling, machine learning, data mining and econometrics are utilized to study digital traces of people's everyday lives. In particular, we investigated how people influence the economic growth of cities, as well as how the urban environment affects the decisions made by people. Focusing on the role of cities as centers of consumption, we found that a gravity model based on the availability of a large and diverse pool of amenities accurately explained human flows observed from credit card records. Investigation of the consumption patterns of individuals in Istanbul, Beijing and various metropolitan areas in the United States revealed a positive relationship between the diversity of urban amenities consumed and the city's economic growth. Taking the perspective of cities as hubs for information exchange, we modeled the interactions between individuals in the cities of Beijing and Istanbul using records of their home and work locations, and demonstrated how cities that facilitate the mixing of diverse human capital are crucial to the flow of new ideas across communities and to their productivity. This contributes to the body of evidence supporting the notion that efficient information exchange is the key factor that drives innovation. To investigate how urban environments shape people's decisions, we studied the social influence city dwellers have on each other and showed how face-to-face interaction and information exchange across different residential communities can shape their behavior and increase the similarity of their financial habits and political views in Istanbul. / by Shi Kai Chong. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
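Gravity models of the kind referenced above have a standard functional form: flows between zones scale with origin population and destination attractiveness and decay with distance. A minimal sketch under that assumed form — the thesis's amenity-diversity formulation is not reproduced, and all numbers are illustrative:

```python
import numpy as np

def gravity_flows(populations, amenities, distances, beta=2.0):
    """Predicted flow from each origin i to each destination j under a
    gravity model: flow_ij is proportional to pop_i * amenities_j / d_ij**beta,
    normalized so each origin's outgoing flows sum to its population."""
    raw = np.outer(populations, amenities) / distances ** beta
    np.fill_diagonal(raw, 0.0)                      # no self-flows
    return raw * (populations / raw.sum(axis=1))[:, None]

pop = np.array([1000.0, 500.0, 2000.0])             # residents per zone
amen = np.array([50.0, 200.0, 120.0])               # amenity counts
dist = np.array([[1.0, 2.0, 5.0],
                 [2.0, 1.0, 3.0],
                 [5.0, 3.0, 1.0]])                  # km between zones
print(gravity_flows(pop, amen, dist).round(1))
```

Fitting beta (and the attractiveness term) against observed flows, such as the credit card records mentioned above, is what turns this template into an explanatory model.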
145

Optimal design for experiments with mixtures

Chan, Ling-yau. January 1986
Thesis: Ph.D., University of Hong Kong, 1987.
146

The Bayesian validation metric : a framework for probabilistic model calibration and validation

Tohme, Tony. January 2020
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, May, 2020 / Cataloged from the official PDF of thesis. / Includes bibliographical references (pages 109-114). / In model development, model calibration and validation play complementary roles toward learning reliable models. In this thesis, we propose and develop the "Bayesian Validation Metric" (BVM) as a general model validation and testing tool. We show that the BVM can represent all the standard validation metrics - square error, reliability, probability of agreement, frequentist, area, probability density comparison, statistical hypothesis testing, and Bayesian model testing - as special cases, while improving, generalizing and further quantifying their uncertainties. In addition, the BVM assists users and analysts in designing and selecting their models by allowing them to specify their own validation conditions and requirements. Further, we expand the BVM framework to a general calibration and validation framework by inverting the validation mathematics into a method for generalized Bayesian regression and model learning. We perform Bayesian regression based on a user's definition of model-data agreement. This allows for model selection on any type of data distribution, unlike Bayesian and standard regression techniques, which "fail" in some cases. We show that our tool is capable of representing and combining Bayesian regression, standard regression, and likelihood-based calibration techniques in a single framework while being able to generalize aspects of these methods. This tool also offers new insights into the interpretation of the predictive envelopes in Bayesian regression, standard regression, and likelihood-based methods, while giving the analyst more control over these envelopes. / by Tony Tohme. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
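One concrete way to read the "probability of agreement" idea: draw samples from the model's predictive distribution and from the data's uncertainty distribution, and estimate the probability that they satisfy a user-specified agreement condition. A minimal Monte Carlo sketch assuming a simple |difference| < epsilon condition — one choice among the many conditions a framework like the BVM allows, with all distributions here invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def agreement_probability(model_samples, data_samples, epsilon):
    """Monte Carlo estimate of the probability that a random draw from
    the model's predictive distribution and a random draw from the
    data's uncertainty distribution fall within epsilon of each other."""
    diffs = np.abs(model_samples[:, None] - data_samples[None, :])
    return (diffs < epsilon).mean()

model = rng.normal(10.0, 0.5, size=2000)   # model prediction + uncertainty
data = rng.normal(10.2, 0.3, size=2000)    # measurement + noise
print(agreement_probability(model, data, epsilon=0.5))
```

Tightening epsilon or changing the condition (for example, to a one-sided requirement) changes what "validated" means, which is the user-specified flexibility the abstract emphasizes.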
147

Meta-modeling and Optimization of Computational Fluid Dynamics (CFD) analysis in thermal comfort for energy-efficient Chilled Beams-based Heating, Ventilation and Air-Conditioning (HVAC) systems

Ghanta, Nikhilesh. January 2020
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, May, 2020 / Cataloged from the official PDF of thesis. / Includes bibliographical references (pages 172-178). / With the rapid rise in the use of air conditioning systems and technological advancements, there is an ever-increasing need to optimize HVAC systems for energy efficiency while maintaining adequate occupant thermal comfort. HVAC systems in buildings alone contribute almost 15% of the overall energy consumption across all sectors in the world, and optimizing this would contribute positively towards overcoming climate change and reducing the global carbon footprint. A relatively modern solution is to implement a smart-building-based control system, and one of the objectives of this study is to understand the physical phenomena associated with workspaces conditioned by chilled beams and to evaluate methods to reduce energy consumption. / Building upon initial work aimed at creating a workflow for a smart building, this thesis presents the results of both experimental and computational studies of occupant thermal comfort with chilled beams (primarily in conference rooms) and the various inefficiencies associated with them. Results from these studies have helped to inform an optimum location for the installation of a chilled beam to counter the effects of incoming solar irradiation through an external window while keeping energy consumption low. A detailed understanding of the various parameters influencing the temperature distribution in a room with chilled beams is achieved using CFD studies and analysis of logged experimental data. / The work converges on a fundamental question of where, how, and what to measure to best monitor and control human thermal comfort, and a novel technique using existing sensors is presented that provides a significant improvement over other existing methods in practice. This technique was validated using a series of experiments. The thesis concludes by presenting early work on hybrid HVAC systems combining chilled beams and ceiling fans for higher economic gains. Future work should seek to perform CFD simulations for a better understanding of hybrid HVAC systems, both in conference rooms and open-plan office spaces, and to design a new sensor that could better estimate human thermal comfort. / by Nikhilesh Ghanta. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
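The meta-modeling step in a study like this typically means fitting a cheap surrogate to a handful of expensive CFD results and then optimizing over the surrogate instead of running more simulations. A minimal sketch with a Gaussian-process surrogate; the design variable, the comfort metric, and every value below are hypothetical, not taken from the thesis:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical training data: chilled-beam distance from the window (m)
# versus a thermal-comfort penalty extracted from a few CFD runs.
X = np.array([[0.5], [1.0], [1.5], [2.0], [3.0], [4.0]])
y = np.array([2.1, 1.4, 0.9, 0.8, 1.2, 1.9])   # illustrative values

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                     normalize_y=True).fit(X, y)

# Optimize over the surrogate instead of running more CFD cases.
candidates = np.linspace(0.5, 4.0, 200).reshape(-1, 1)
best = candidates[np.argmin(surrogate.predict(candidates))]
print(f"surrogate-optimal beam location: {best[0]:.2f} m from the window")
```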
148

Modeling exascale data generation and storage for the large hadron collider computing network

Massaro, Evan K. January 2020
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, May, 2020 / Cataloged from the official PDF of thesis. / Includes bibliographical references (pages 85-86). / The Large Hadron Collider (LHC) is the world's largest and highest-energy particle accelerator. With the particle collisions produced at the LHC and measured with the Compact Muon Solenoid (CMS) detector, the CMS experimental group performs precision measurements and general searches for new physics. Year-round CMS operations produce 100 petabytes of physics data per year, which is stored within a globally distributed grid network of 70 scientific institutions. By 2027, upgrades to the LHC and CMS detector will allow unprecedented probes of microscopic physics, but in doing so will generate 2,000 petabytes (2 exabytes) of physics data per year. To address the computational requirements of CMS, the costs of CPU resources, disk and tape storage, and tape drives were modeled. These resources were then used in a model of the major CMS computing processes and required infrastructure. / In addition to estimating budget requirements, this model produced bandwidth requirements, with the transatlantic network link addressed explicitly. Given discrete or continuously parameterized policy decisions, the system cost and required network bandwidth could be modeled as a function of the policy. This sensitivity analysis was coupled to an uncertainty quantification of the model outputs, which were functions of the estimated system parameters. The expected values of the system cost and maximum transatlantic network activity were modeled to increase 40-fold in 2027 relative to 2018. In 2027 the required transatlantic network capacity was modeled to have an expected value of 210 Gbps, with a 95% confidence interval that reaches 330 Gbps, just under the current bandwidth of 340 Gbps. By changing specific computing policies, the system cost and network load were shown to decrease. / Specific policies can reduce the network load to an expected value of 150 Gbps, with a 95% confidence interval that reaches 260 Gbps. Given the unprecedented volume of data, such policy changes can allow CMS to meet its future physics goals. / by Evan K. Massaro. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
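The uncertainty quantification described above amounts to propagating input distributions through the cost and bandwidth model by Monte Carlo and reading off expected values and confidence intervals. A minimal sketch of that pattern; the input distributions, the transatlantic fraction, and the peak-to-average factor are placeholders, not the thesis's calibrated inputs, so the printed numbers will not match its results:

```python
import numpy as np

rng = np.random.default_rng(2)
sims = 100_000

# Hypothetical distributions for uncertain model inputs in 2027.
data_volume_pb = rng.normal(2000, 200, sims)           # PB/year produced
transatlantic_fraction = rng.uniform(0.05, 0.09, sims) # share crossing the link
peak_to_average = rng.normal(2.5, 0.3, sims)           # burstiness factor

# Convert yearly transatlantic volume to peak network load in Gbps.
# 1 PB = 8e6 gigabits; divide by seconds/year for an average rate.
seconds_per_year = 365 * 24 * 3600
avg_gbps = data_volume_pb * transatlantic_fraction * 8e6 / seconds_per_year
peak_gbps = avg_gbps * peak_to_average

print(f"expected peak load: {peak_gbps.mean():.0f} Gbps")
print(f"95th percentile:    {np.percentile(peak_gbps, 95):.0f} Gbps")
```

A policy change enters such a model as a shift in one of the input distributions (for example, a smaller transatlantic fraction after moving processing closer to the data), which is how the sensitivity analysis above couples to the uncertainty quantification.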
149

Short-to-medium term dengue forecast in Singapore

Ong, Gin Kaijing. January 2020
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, May, 2020 / Cataloged from the official PDF of thesis. / Includes bibliographical references (pages 103-116). / Dengue fever is currently the fastest-spreading mosquito-borne disease in the world, and a substantial public health problem due to its geographic spread, intensity, and resulting economic impact. The disease is endemic in Singapore, and has caused multiple outbreaks of unprecedented magnitude in the last two decades. Without a specific antiviral agent or a safe, effective and affordable vaccine for the disease, vector control remains the most effective way to control dengue transmission. The objective of this thesis is to develop spatially resolved, accurate short-to-medium term dengue forecast systems, informed by mechanistic understanding of dengue transmission from previous field studies. Such systems could improve our understanding of factors that influence the transmission of dengue fever in Singapore, and potentially be used by government agencies for the planning of targeted vector control measures. / Data on dengue persistence, housing types, rainfall and seasonality were used to predict weekly dengue incidence at a housing level (i.e. dengue incidence in the high-rise and low-rise subzone groups) and at a residential subzone level. For both spatial resolutions, a separate multiple linear regression submodel was constructed for each forecast horizon of 1 to 12 weeks. Our housing-level model was able to achieve good predictions up to 6 weeks in advance, with predictive R² greater than 0.5 and total explained variance greater than 60%, but our subzone-level model was not as successful. At a housing level, we found that rainfall, housing type and seasonality predictors became relatively more important at longer forecast horizons. We also found that increased rainfall in the preceding months implies lower dengue incidence, and that rainfall influences seasonal variability in dengue incidence to a large degree. / The low-rise subzone group was also associated with higher dengue incidence than the high-rise subzone group. These findings support hypotheses from previous field studies on the roles of rainfall and urban hydrology in shaping the spatiotemporal distribution of dengue in Singapore. These risk factors of dengue could be included in current operational forecast systems to improve their predictive performance. / by Gin Kaijing Ong. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
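Fitting "a separate multiple linear regression submodel for each forecast horizon" is the direct multi-horizon strategy: model h predicts incidence h weeks ahead from this week's features. A minimal sketch of that setup; the feature construction is simplified and the data below are synthetic, so only the interface matches the description above:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_direct_forecasters(features, incidence, max_horizon=12):
    """Fit one linear regression per forecast horizon.

    features:  (n_weeks, n_features) array, e.g. lagged rainfall,
               past incidence, housing-type and seasonal terms.
    incidence: (n_weeks,) weekly dengue case counts.
    """
    models = {}
    for h in range(1, max_horizon + 1):
        X, y = features[:-h], incidence[h:]   # predict h weeks ahead
        models[h] = LinearRegression().fit(X, y)
    return models

# Toy example with random inputs, just to show the interface.
rng = np.random.default_rng(3)
feats = rng.normal(size=(200, 4))
cases = rng.poisson(20, size=200).astype(float)
models = fit_direct_forecasters(feats, cases)
print(models[6].predict(feats[-1:]))   # 6-week-ahead forecast
```

Because each horizon gets its own coefficients, the relative weight of rainfall versus seasonality predictors can change with lead time, which is exactly the pattern the abstract reports.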
150

Contributions to automatic meshing in the AMORE scheme

Foo, Angus. January 2020
Thesis: S.M., Massachusetts Institute of Technology, Computation for Design and Optimization Program, February, 2020 / Manuscript. / Includes bibliographical references (pages 75-78). / Traditional Finite Element (FE) analysis requires the discretisation of continuous bodies into connected meshes of triangular and quadrilateral elements in 2D, or tetrahedral [tet] and hexahedral [hex] elements in 3D. Besides the restrictions due to compatibility of adjacent elements, one primary concern in mesh generation is minimizing the distortion of elements and the number of distorted elements so as to reduce the discretisation error. This has generally steered research in 2D mesh generation techniques away from grid-based methods, which tend to generate significant numbers of distorted elements; additionally, such methods are generally not considered at all in 3D mesh generation. Furthermore, significant man-hours are spent during the meshing phase of FE analyses to partition the geometry and prescribe element types, where the ability to mesh portions of the geometry with hex elements is preferred over using tet elements. The recent advances in the theory of Overlapping Finite Elements (OFE) now allow the use of distorted elements without compromising the accuracy of the FE analysis. However, a trade-off arises because more degrees of freedom (DOFs) are required at triangular (and tetrahedral) nodes. We propose the reintroduction of optimised 2D grid-based mesh generation techniques to decrease the DOFs in a way that is generalizable to arbitrary 3D geometries, as part of a step towards a truly automated meshing paradigm, referred to as Automatic Meshing with Overlapping and Regular Elements (AMORE), which requires minimal to no input from the engineer. / by Angus Foo. / S.M. / S.M. Massachusetts Institute of Technology, Computation for Design and Optimization Program
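A grid-based (overlay) method starts from a structured grid of regular quadrilaterals and only later fits it to the geometry's boundary, which is where the distorted elements arise. A minimal sketch of the structured-grid starting point; the boundary-fitting step, the hard part that overlapping elements make tolerable, is omitted, and the function name is illustrative:

```python
import numpy as np

def grid_quad_mesh(width, height, nx, ny):
    """Generate nodes and quad connectivity for a structured 2D grid,
    the starting point of grid-based mesh generation. Returns an
    (n_nodes, 2) coordinate array and an (n_quads, 4) array of
    counter-clockwise node indices."""
    xs, ys = np.meshgrid(np.linspace(0, width, nx + 1),
                         np.linspace(0, height, ny + 1))
    nodes = np.column_stack([xs.ravel(), ys.ravel()])
    quads = []
    for j in range(ny):
        for i in range(nx):
            n0 = j * (nx + 1) + i                 # bottom-left node
            quads.append([n0, n0 + 1, n0 + nx + 2, n0 + nx + 1])
    return nodes, np.array(quads)

nodes, quads = grid_quad_mesh(2.0, 1.0, nx=4, ny=2)
print(len(nodes), "nodes,", len(quads), "quad elements")
```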
