About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Tectonic smoothing and mapping

Ni, Kai 16 May 2011 (has links)
Large-scale mapping has become key to numerous applications, e.g., simultaneous localization and mapping (SLAM) for autonomous robots. Despite the success of many SLAM projects, there are still challenging scenarios in which most current algorithms cannot deliver an exact solution fast enough. One of these challenges is the size of SLAM problems, which has grown by several orders of magnitude over the last decade. Another is the large amount of noise baked into the measurements, which often yields poor initializations and slows down the optimization or even causes it to fail. Urban 3D reconstruction is another popular application for large-scale mapping and has recently received considerable attention from the computer vision community. High-quality 3D models are useful in various successful cartographic and architectural applications, such as Google Earth or Microsoft Live Local. At the heart of urban reconstruction problems is structure from motion (SfM). Due to the wide availability of cameras, especially on handheld devices, SfM is becoming an increasingly crucial technique for handling large numbers of images.
In this thesis, I present a novel batch algorithm, namely Tectonic Smoothing and Mapping (TSAM). I show that the original SLAM graph can be recursively partitioned into multi-level submaps using the nested dissection algorithm, which leads to the cluster tree, a powerful graph representation. By employing nested dissection, the algorithm greatly minimizes the dependencies between two subtrees, and the optimization of the original graph can be done using bottom-up inference along the corresponding cluster tree. To speed up the computation, a base node is introduced for each submap and is used to represent the rigid transformation of the submap in the global coordinate frame. As a result, the optimization moves the base nodes rather than the actual submap variables.
I also show that TSAM can be successfully applied to the SfM problem, in which a hypergraph representation is employed to capture the pairwise constraints between cameras. The hierarchical partitioning based on the hypergraph not only yields a cluster tree as in the SLAM problem but also forces the resulting submaps to be nonsingular. I demonstrate the TSAM algorithm on various simulation and real-world data sets.
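The partitioning idea in the abstract above can be sketched in a few lines. This is a hedged illustration only: real nested dissection selects a small vertex separator from the graph's structure, and TSAM's actual submap optimization is far more involved; here the bisection is a naive halving of an ordered node list, and all names (`nested_dissection`, `bottom_up_depth`) are illustrative assumptions, not the thesis's implementation.

```python
def nested_dissection(nodes, min_size=2):
    """Recursively bisect a list of node ids into a cluster tree.

    Real nested dissection picks a small separator from the graph
    structure; this sketch simply halves the ordered list. Returns a
    nested dict: {"nodes": [...], "children": [subtree, subtree]}.
    """
    if len(nodes) <= min_size:
        return {"nodes": list(nodes), "children": []}  # leaf submap
    mid = len(nodes) // 2
    return {
        "nodes": list(nodes),
        "children": [nested_dissection(nodes[:mid], min_size),
                     nested_dissection(nodes[mid:], min_size)],
    }

def bottom_up_depth(tree):
    """Visit leaves first, mimicking TSAM's bottom-up inference order."""
    if not tree["children"]:
        return 1
    return 1 + max(bottom_up_depth(c) for c in tree["children"])

tree = nested_dissection(list(range(8)))
print(bottom_up_depth(tree))  # cluster-tree depth for 8 nodes -> 3
```

The point of the recursion is that each submap can be optimized largely independently of its sibling, with only the separator (in TSAM, the base node's rigid transform) mediating between subtrees.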
2

Forecasting Model for High-Speed Rail in the United States

Ramesh Chirania, Saloni 08 November 2012 (has links)
This work presents a tool to model both current rail and future high-speed rail (HSR) corridors. The model is designed as an addition to the existing capabilities of TSAM (Transportation System Analysis Model) for modeling commercial airline and automobile demand. TSAM is a nationwide, county-to-county, multimodal demand forecasting tool based on the classical four-step process. A variation of the Box-Cox logit model is proposed to best capture the characteristic behavior of rail demand in the US. The utility equation uses travel time and travel cost as the decision variables for each mode. Additionally, a mode-specific geographic constant is applied to the rail mode to model the Northeast Corridor (NEC). The NEC is of particular interest in modeling, as it accounts for most of the rail ridership. The coefficients are computed using genetic algorithms. A one-county-to-one-station assignment is employed for the station choice model. Modifications are made to the station choice model to replicate choices affected by the ease of access via driving and mass transit. The functions for time and cost inputs for the rail system were developed from the Amtrak website. These changes and calibration coefficients are incorporated into TSAM. The TSAM model is executed for present and future years, and the predictions are discussed. A sensitivity analysis for the cost and speed of the predicted HSR is shown. The model shows the market shift among modes with the introduction of HSR. Limited data presents the most critical hindrance to improving the model further. The current validation process incorporates essential assumptions and approximations for transfer rates, short-trip percentages, and access and egress distances. The challenges posed to the model by limited data are also discussed. / Master of Science
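The utility specification described above — Box-Cox-transformed travel time and cost, plus a rail-specific constant for the NEC — can be sketched as a multinomial logit. This is an illustrative assumption throughout: the coefficient values, mode list, and function names below are invented for the sketch, and the genetic-algorithm calibration step is not shown; none of this is the thesis's calibrated model.

```python
import math

def box_cox(x, lam):
    """Box-Cox transform; reduces to log(x) as lambda approaches 0."""
    return math.log(x) if abs(lam) < 1e-9 else (x ** lam - 1.0) / lam

def mode_shares(times, costs, b_time=-0.05, b_cost=-0.02,
                lam=0.5, rail_const=0.0):
    """Multinomial logit shares over modes from time (min) and cost ($).

    Coefficients are placeholder values; in the thesis they would be
    estimated, e.g. by a genetic algorithm against observed ridership.
    """
    utils = {}
    for mode in times:
        u = (b_time * box_cox(times[mode], lam)
             + b_cost * box_cox(costs[mode], lam))
        if mode == "rail":
            u += rail_const  # mode-specific constant, e.g. for the NEC
        utils[mode] = u
    z = sum(math.exp(u) for u in utils.values())
    return {m: math.exp(u) / z for m, u in utils.items()}

shares = mode_shares(
    times={"air": 120.0, "auto": 300.0, "rail": 200.0},
    costs={"air": 180.0, "auto": 90.0, "rail": 110.0},
)
```

Raising `rail_const` shifts share toward rail while the shares still sum to one, which is the mechanism the abstract describes for capturing the NEC's disproportionate ridership.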
