71

Laboratory Evaluation of In-Situ Tests as Potential Quality Control/Quality Assurance Tools

Seyman, Ekrem 04 September 2003 (has links)
New in-situ test devices such as the Geogauge, Light Falling Weight Deflectometer (LFWD), and Dynamic Cone Penetrometer (DCP) have become available. Unlike the nuclear density gauge, these methods provide measurements based on the engineering properties (strength/stiffness) of soil rather than physical properties such as field density and moisture content. However, the Geogauge, LFWD, and DCP have not yet been proven reliable, and correlations of these tests with standard tests are limited. An extensive laboratory investigation was carried out to evaluate the Geogauge, LFWD, and DCP as potential tests to measure the in-situ stiffness of highway materials and embankments. In this study, test layers were prepared in two boxes measuring 5 ft long x 3 ft wide x 2 ft deep at the Louisiana Transportation Research Center (LTRC) Geosynthetic Engineering Research Laboratory (GERL). The results from a series of laboratory tests on embankment soils and base course materials were used to correlate Geogauge, LFWD, and DCP measurements with the Plate Load Test (PLT) and California Bearing Ratio (CBR). There is good correlation between the LFWD dynamic modulus and the PLT elastic modulus. The LFWD is a better alternative to the static PLT than the Geogauge: although the LFWD is a dynamic test, its similarity in depth of influence to the PLT and the quality of the developed correlations suggest that it has better potential to replace the PLT. There is no significant correlation between the LFWD and the CBR test. The Geogauge and the DCP correlate better with the CBR, and the DCP has already been proven an effective tool for estimating in-situ CBR. Based on the developed correlations and laboratory experience, it was found that the investigated devices have the potential to measure the in-situ stiffness of highway materials and embankments.
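As a rough illustration of the kind of correlation analysis described in this abstract, the sketch below fits a simple linear regression between LFWD dynamic moduli and PLT elastic moduli. The numerical values and the linear form are assumptions for demonstration only, not results from this study.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements (MPa) -- illustrative only, not study data.
lfwd_modulus = np.array([35.0, 52.0, 61.0, 78.0, 90.0, 104.0])
plt_modulus = np.array([30.0, 48.0, 55.0, 70.0, 83.0, 95.0])

# Ordinary least-squares fit: E_PLT ~ a * E_LFWD + b
slope, intercept, r_value, p_value, std_err = stats.linregress(lfwd_modulus, plt_modulus)

print(f"E_PLT ~ {slope:.2f} * E_LFWD + {intercept:.2f}  (R^2 = {r_value**2:.3f})")
```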
72

Field Evaluation of in-Situ Test Technology for QC/QA during Construction of Pavement Layers and Embankments

Nazzal, Munir Darwish 10 October 2003 (has links)
With the coming change from empirical to mechanistic-empirical pavement design, it becomes essential to move the quality control/quality assurance (QC/QA) procedures for compacted materials from a unit-weight-based criterion to a stiffness/strength-based criterion. Non-destructive in-situ tests such as the Geogauge, Dynamic Cone Penetrometer (DCP), and Light Falling Weight Deflectometer (LFWD) can be used as effective tools for assessing subsurface conditions and evaluating the stiffness of pavement materials and embankments. This thesis evaluates the potential use of these three devices to reliably measure the stiffness characteristics of highway materials for possible application in QC/QA procedures during and after the construction of pavement layers and embankments. To achieve this, field tests were conducted on highway sections selected from different projects in Louisiana. In addition, six test sections and three trench sections were constructed and tested at the LTRC Accelerated Load Facility (ALF) site. The field tests included Geogauge, LFWD, and DCP tests as well as standard tests such as the Plate Load Test (PLT) and the Falling Weight Deflectometer (FWD) test. California Bearing Ratio (CBR) laboratory tests were also conducted on samples collected during the field tests. A statistical analysis was conducted to correlate the measurements obtained from the three investigated devices with those obtained from the standard tests. Good correlations were obtained between the measurements of the investigated devices and the standard tests. Laboratory tests were also conducted to evaluate the influence depth of the Geogauge and LFWD; the results indicated that the average influence depths of the Geogauge and the LFWD are about 200 mm and 280 mm, respectively.
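For context, the surface modulus that plate- and deflectometer-type devices aim to capture is commonly back-calculated from Boussinesq elastic half-space theory. The generic form below uses textbook values for the shape factor and Poisson's ratio; it is not a parameterization reported in this thesis.

```latex
E = \frac{k \,(1 - \nu^{2})\, \sigma_{0} \, R}{\delta}
```

where sigma_0 is the applied plate pressure, R the plate radius, delta the measured deflection, nu Poisson's ratio, and k a stress-distribution factor (k = 2 for a flexible plate with uniform pressure, k = pi/2 for a rigid plate).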
73

Strength Properties of Granular Materials

Novoa-Martinez, Brenda 11 July 2003 (has links)
This thesis presents the results of experimental work conducted on glass beads to investigate the effects of particle size, confining pressure, and surface roughness on the strength properties of particulate media. Conventional triaxial compression tests were conducted to investigate these effects. Three different sizes of beads were tested: small (diameter = 0.75-1.00 mm), medium (diameter = 1.55-1.85 mm), and large (diameter = 3.30-3.60 mm). The glass beads were subjected to three different confining pressures: 25, 100, and 400 kPa. Smooth and etched beads were tested; the etched surface was achieved by submerging the beads in a bath of hydrofluoric acid. It was found that as the confining pressure increases, the peak stress ratio decreases. It was also found that an increase in roughness produces an increase in the peak friction angle. Particle size was found to affect the stress-strain and volumetric strain behavior of the beads; however, no specific trend was found.
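For a cohesionless material such as glass beads, the peak friction angle referred to above is conventionally obtained from the peak principal stress ratio measured in the triaxial cell; the standard Mohr-Coulomb relation with zero cohesion is:

```latex
\sin\varphi_{p} = \frac{\sigma_{1} - \sigma_{3}}{\sigma_{1} + \sigma_{3}}
               = \frac{(\sigma_{1}/\sigma_{3})_{p} - 1}{(\sigma_{1}/\sigma_{3})_{p} + 1}
```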
74

Staying Afloat: A Risk Analysis Study of Flooding in South Louisiana

Tomaszkiewicz, Marlene 11 November 2003 (has links)
For decades, South Louisiana has been battling flood risk. The area has been inundated by flood waters several times, whether due to tropical weather, strong rainfall events, or rising waters. This has resulted in millions of dollars in damages, and in some instances the same home has been affected multiple times. Many argue that South Louisiana is simply not a livable area to begin with and that the risk is too great; however, abandoning the area is not an option, and so the battle against flood risk continues. How much risk is actually involved? The Federal Emergency Management Agency maintains a set of flood maps showing which areas have been determined to lie within the 100-year floodplain, but these maps are not always accurate. Using a geographic information system (GIS), one can compare the locations of past flood claims against the flood zone maps. Additionally, hydrologic models of previous storms in the watershed can be incorporated into the GIS analysis. This study determines the true risk to communities within South Louisiana by comparing existing flood zone maps against the locations of past flood claims and against hydrologic models.
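A minimal sketch of the overlay step described above, assuming flood-claim points and FEMA flood-zone polygons are available as shapefiles; the file names, the `FLD_ZONE` attribute, and the list of zone codes are assumptions, and the study's actual data sources and processing may differ.

```python
import geopandas as gpd

# Hypothetical inputs -- file and attribute names are assumptions.
claims = gpd.read_file("flood_claims.shp")      # point locations of past flood claims
zones = gpd.read_file("fema_flood_zones.shp")   # FEMA flood-zone polygons with a FLD_ZONE column

# Ensure both layers share the same coordinate reference system.
claims = claims.to_crs(zones.crs)

# Tag each claim with the flood zone it falls in (NaN if outside all mapped zones).
joined = gpd.sjoin(claims, zones[["FLD_ZONE", "geometry"]], how="left", predicate="within")

# Claims that fall outside the mapped 100-year floodplain (e.g., zones A/AE/V/VE).
outside = joined[~joined["FLD_ZONE"].isin(["A", "AE", "V", "VE"])]
print(f"{len(outside)} of {len(claims)} claims lie outside the mapped 100-year floodplain")
```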
75

A Methodology for Deriving Performance Measures from Spatio-Temporal Traffic Contour Maps Using Digital Image Analysis Procedures

Kotha, Prashanth 12 November 2003 (has links)
The main focus of this study is to improve the data analysis tools used in performance monitoring and level-of-service assessment of freeway systems. The study presents a methodology for developing new second-order statistical measures derived from texture characterization techniques in the field of digital image analysis. The new measures are capable of extracting properties such as smoothness, homogeneity, regularity, and randomness in traffic behavior from spatio-temporal traffic contour maps. To study the new performance measures, a total of 14,270 15-minute traffic contour maps were generated for a 3.4-mile section of I-4 in Orlando, Florida, covering 24 hours a day over a period of 5 weekdays. A correlation matrix of the obtained measures was examined for all the constructed maps to check for information redundancy. This resulted in retaining a set of three second-order statistical measures: angular second moment (ASM), contrast (CON), and entropy (ENT). The retained measures were analyzed to examine their sensitivity to various traffic conditions, expressed by the overall mean speed of each contour map. The measures were also used to evaluate the level of service for each contour map. The sensitivity analysis and level-of-service criteria can be implemented in real time using a stand-alone module developed in this study. The study also presents a methodology to compare the traffic characteristics of various congested conditions; to examine these congestion characteristics, a total of 10,290 traffic contour maps were generated from a 7.5-mile section of the freeway over a period of 5 weekdays.
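The three retained measures correspond to standard second-order (gray-level co-occurrence) texture statistics. Written in terms of the normalized co-occurrence matrix p(i, j), the usual definitions are shown below; the thesis's exact formulation may differ in normalization or notation.

```latex
\text{ASM} = \sum_{i}\sum_{j} p(i,j)^{2}, \qquad
\text{CON} = \sum_{i}\sum_{j} (i-j)^{2}\, p(i,j), \qquad
\text{ENT} = -\sum_{i}\sum_{j} p(i,j)\,\log p(i,j)
```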
76

Exploring Geometric, Kinematic and Behavioral Scalability of Microscopic Traffic Simulation Systems

Chakravarthy, Srikanth 17 November 2003 (has links)
Even with today's remarkable advancement in computing power, microscopic simulation modeling remains a computationally intensive process that imposes limitations on its potential use for modeling large-scale transportation networks. Research and practice have repeatedly demonstrated that microscopic simulation runs can be excessively time-consuming, depending on the network size, the number of simulated entities (vehicles), and the computational resources available. While the microscopic features of a simulated system collectively define the overall system characteristics, it is argued that the microscopic simulation process itself is not necessarily free of redundancy, which, if reduced, could substantially improve the computational efficiency of simulation systems without compromising the overall integrity of the simulation process. This research study explores the concept of scalability for microscopic traffic simulation systems in order to improve their computational efficiency and cost-effectiveness. More specifically, we present an optimized downsampling procedure for transforming the full-scale simulation system (prototype) into a geometrically, kinematically, and behaviorally equivalent reduced-scale system (microcosm). The ultimate goal is to execute the microscopic simulation process in the microcosm environment, observe all necessary macroscopic characteristics and performance measures, and upsample the results back to the prototype environment. Experimental analysis was conducted on a homogeneous freeway corridor to examine the effect of different operating conditions on the optimal solutions for the downsampling procedure. The study also investigates the tradeoff between performance and scalability of microscopic simulation systems.
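As a purely conceptual sketch of the downsample-run-upsample idea (not the optimized procedure developed in this study), the snippet below scales the simulated demand by an assumed factor before a run and maps the resulting flow back afterward. The scale factor and the simulator call are placeholders.

```python
def downsample_demand(demand_veh_per_hr: float, scale: float) -> float:
    """Reduce the simulated demand by an assumed scale factor (0 < scale <= 1)."""
    return demand_veh_per_hr * scale

def upsample_flow(microcosm_flow_veh_per_hr: float, scale: float) -> float:
    """Map a flow observed in the reduced-scale (microcosm) run back to the prototype."""
    return microcosm_flow_veh_per_hr / scale

scale = 0.5                                # assumed downsampling factor
prototype_demand = 3600.0                  # veh/hr on the full-scale corridor
microcosm_demand = downsample_demand(prototype_demand, scale)
# microcosm_flow = run_simulation(microcosm_demand)   # hypothetical simulator call
microcosm_flow = 1500.0                    # placeholder result for illustration
print(f"Estimated prototype flow: {upsample_flow(microcosm_flow, scale):.0f} veh/hr")
```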
77

Assessing the Feasibility of Supplying Vehicle Activity Data to MOBILE6 Using the Global Positioning System (GPS)

Varanasi, Srinivas 13 November 2003 (has links)
MOBILE6 is a software program developed by the U.S. Environmental Protection Agency (EPA) to estimate current and future vehicle emissions. The EPA requires all states to develop emission inventories for State Implementation Plans (SIPs) and to demonstrate conformity in non-attainment areas. The primary objective of the MOBILE models is to develop emission inventories in compliance with the EPA's regulations. The research reported here is primarily a proof-of-concept study of the feasibility of using the Global Positioning System (GPS) to supply vehicle activity data to MOBILE6. It also deals with identifying vehicle registration input to MOBILE6 using non-GPS sources. Input data obtained from GPS and non-GPS sources are graphically and statistically compared with the EPA's national default values.
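A minimal sketch of deriving one piece of vehicle activity data (speed between consecutive fixes) from raw GPS records, assuming a simple list of (timestamp, latitude, longitude) tuples; MOBILE6 input formats and the study's actual processing steps are not reproduced here.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical GPS trace: (time in seconds, latitude, longitude).
trace = [(0, 30.4500, -91.1500), (10, 30.4508, -91.1500), (20, 30.4517, -91.1501)]

# Speed (mph) over each interval between consecutive fixes.
for (t0, la0, lo0), (t1, la1, lo1) in zip(trace, trace[1:]):
    speed_mph = haversine_m(la0, lo0, la1, lo1) / (t1 - t0) * 2.23694
    print(f"{t0:>3}-{t1:<3} s: {speed_mph:.1f} mph")
```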
78

Evaluation of Static Low Density Media Filters for Use in Domestic Wastewater Treatment

Wagener, Cynthia 14 November 2003 (has links)
Static Low Density Media (SLDM) filters are submerged granular-medium filters that contain a static matrix of floating media. These filters provide concurrent biological and physical treatment and are therefore classified as bioclarifiers. Through different design and operation strategies, SLDM filters may be used for a variety of functions, such as solid-liquid separation alone, organic conversion and solids capture, nitrification and solids capture, and denitrification and solids capture. For operation as an aerobic unit, an external aeration strategy was developed to preserve the static nature of the bed. In this study, SLDM filters treated a highly variable flow of domestic wastewater generated at an industrial facility in Denham Springs, Louisiana. Various bench-scale filter configurations were evaluated on their ability to perform both biological and physical treatment at a variety of hydraulic filtration rates, backwash frequencies, and configurations, while constantly keeping the filter bed in an aerobic state. Data collected from units recirculated via airlift pumps are reported. The pneumatically washed units in this study employed a modified-shape media and a high backwashing frequency to enhance biofiltration capacity. Units were fed primary domestic wastewater effluents with mean CBOD5 (carbonaceous biochemical oxygen demand) concentrations of 100 and 150 mg/L. Mean influent TSS (total suspended solids) values for the two units tested were 60 and 90 mg/L. The airlift/SLDM filter combination was able to maintain mean hydraulic filtration rates in the range of 10-15 m/hr. Findings indicate the unit is capable of producing CBOD5 and TSS effluent qualities in the 10-20 mg/L range when subject to organic loadings between 1 and 3.5 kg/m³·day. These loadings are higher than reported loading capacities for conventional secondary wastewater treatment strategies, such as activated sludge units, trickling filters, and biological aerated filters. In this study, effluent CBOD5 levels were closely correlated with effluent TSS levels. Although no problems with media caking were observed, at times poor backwash interval selection did lead to severe oxygen depression within the bead bed. It is concluded that SLDM filters show promise for application in the domestic wastewater arena, particularly where the scale of the operation places a premium on simple operation.
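For reference, the volumetric organic loading quoted above (kg CBOD5/m³·day) is conventionally computed from the influent flow, influent concentration, and media bed volume; a generic form is:

```latex
\text{OLR} = \frac{Q \cdot C_{\mathrm{CBOD_5}}}{V_{\mathrm{bed}}}
```

with Q the influent flow (m³/day), C_CBOD5 the influent concentration expressed in kg/m³ (mg/L divided by 1000), and V_bed the media bed volume (m³).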
79

Development of a Methodology to Delineate Hurricane Evacuation Zones

Meduri, Nandagopal 27 January 2004 (has links)
The main focus of this study is to demonstrate a simple and easily implemented process for identifying hurricane evacuation zones. The study presents a new methodology for delineating hurricane evacuation zones that makes the identification of zones, and thus the task of evacuation officials, easier. Hurricane evacuation zones were identified based on the elevation of the points comprising the study area. An area layer was created from the storm surge model runs and overlaid with the zip code boundary layer and available land use data. Elevation point data were superimposed on the study area layer to determine the mean and standard deviation of elevation, which help identify adjacent zones that are similar in elevation. Such zones are grouped together, and the process of joining zones continues until the number of zones is sufficiently reduced. The entire process was carried out in TransCAD, a transportation/GIS software package. Because this iterative process would be extremely tedious to perform manually, it was automated by writing a customized add-in program for TransCAD. The study also includes the application of this methodology to the New Orleans metropolitan area, an ideal test case for the methodology developed here. The results show the New Orleans area conveniently divided into hurricane evacuation zones based on elevation, zip code, and storm surge data. The GIS program developed during this study provides a framework that may be built upon and shared with other researchers in the future.
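A simplified sketch of the iterative grouping step described above (implemented in the study as a TransCAD add-in, not in Python): repeatedly merge the pair of adjacent zones whose mean elevations are most similar until a target zone count is reached. The zone identifiers, adjacency structure, and similarity rule here are illustrative assumptions.

```python
# Each zone: (mean elevation in m, area weight); adjacency lists which zones touch.
zones = {"A": (1.2, 4.0), "B": (1.5, 3.0), "C": (4.8, 2.0), "D": (5.1, 5.0)}
adjacency = {("A", "B"), ("B", "C"), ("C", "D")}
target_zone_count = 2

while len(zones) > target_zone_count:
    # Pick the adjacent pair with the smallest difference in mean elevation.
    a, b = min(adjacency, key=lambda pair: abs(zones[pair[0]][0] - zones[pair[1]][0]))
    (ea, wa), (eb, wb) = zones[a], zones[b]
    merged_id = a + b
    zones[merged_id] = ((ea * wa + eb * wb) / (wa + wb), wa + wb)  # area-weighted mean
    del zones[a], zones[b]
    # Re-point adjacency at the merged zone and drop the merged pair / self-loops.
    adjacency = {
        tuple(sorted(merged_id if z in (a, b) else z for z in pair))
        for pair in adjacency
        if set(pair) != {a, b}
    }
    adjacency = {pair for pair in adjacency if pair[0] != pair[1]}

print(zones)
```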
80

Dynamic Performance of Bridges and Vehicles under Strong Wind

Chen, Suren 20 February 2004 (has links)
The record span length of flexible bridges has been repeatedly broken with the development of modern materials and construction techniques. As bridge spans increase, the dynamic response of the bridge becomes more significant under external wind action and traffic loads. The present research focuses specifically on the dynamic performance of bridges, as well as of the vehicles crossing them, under strong wind. The dissertation studied the coupled vibration features of bridges under strong wind and proposed a modal coupling assessment technique. A closed-form spectral solution and a practical methodology are provided to predict coupled multimode vibration without actually solving the coupled equations. The modal coupling effect was then quantified using a modal coupling factor (MCF). Based on the modal coupling analysis techniques, the mechanism of the transition from multi-frequency buffeting to single-frequency flutter was numerically demonstrated. As a result, the transition phenomena observed in wind tunnel tests can be better understood and some confusing concepts in flutter vibration are clarified. A framework for a vehicle-bridge-wind interaction analysis model was then built. With the interaction model, the dynamic performance of vehicles and bridges under wind and road-roughness input can be assessed for different vehicle numbers and vehicle types. Based on the interaction analysis results, a framework for a vehicle accident analysis model was introduced. As a result, safer vehicle transportation under wind can be expected and the service capabilities of transportation infrastructure can be maximized; this is especially important for evacuation planning in hurricane-prone areas, where it can potentially save lives. The dissertation finally studied how to improve the dynamic performance of bridges under wind. The special features of structural control of the buffeting response under strong wind with Tuned Mass Dampers (TMDs) were studied. It was found that TMDs can also be very effective at high wind speeds by attenuating modal coupling effects among modes. A 3-row TMD control strategy and a movable control strategy under hurricane conditions were then proposed to achieve better control performance.
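As background on TMD tuning (the classical Den Hartog tuning for an undamped primary mode under harmonic excitation, not the 3-row or movable strategies proposed in this dissertation), the optimal frequency ratio and damping ratio for a damper-to-structure mass ratio mu are:

```latex
f_{\mathrm{opt}} = \frac{1}{1 + \mu}, \qquad
\zeta_{\mathrm{opt}} = \sqrt{\frac{3\mu}{8\,(1 + \mu)^{3}}}
```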
