11. Satellite Remote Sensing for the Assessment of Protected Areas: A Global Application
Chisholm, Sarah Patricia, 08 February 2022
Unprecedented rates of modern species extinction present a serious challenge in the field of conservation biology. While protected areas (PAs) are regarded as key tools to reduce rates of biodiversity loss, it is unclear to what degree PAs can maintain their ecological integrity while experiencing external pressures from outside of their boundaries. Satellite remote sensing essential biodiversity variables (SRS-EBVs) are indicators of biodiversity that can be produced with large spatial coverage and used to measure PAs’ capacity to preserve ecological elements important for biodiversity. In this study, I used SRS-EBVs representative of ecosystem structure and function, including productivity, disturbance regimes, ecosystem extent, and ecosystem composition. I tested whether PAs preserved these determinants of species survival through time, whether any changes in these variables in PAs were independent of changes in their surrounding areas (buffer zones), and whether the management type of PAs influenced either of these patterns. I found that PAs maintained elements of ecosystem structure, including habitat heterogeneity and extent, inside their boundaries, regardless of changes that occurred in their surroundings. In contrast, PAs were less effective at sustaining elements of ecosystem function and mitigating other forms of human disturbance. Productivity within PAs tracked that of their surroundings, underscoring the inability of PAs to buffer shifts in climate regimes that put some species at greater risk of extinction. Fire disturbance trends were likewise maintained across PA boundaries; however, the causes of these fires are unknown, highlighting the importance of supplemental fire census data to tease apart natural fire regimes from harmful burns. Finally, other human pressures thought to be indirect effects of linear transportation features (e.g., edge effects from roads) were observed to have spilled over from buffer zones into PAs. Planning for future development of the global PA network can benefit greatly from the application of SRS-EBVs. Pairing these data products with foundational ecological conservation principles can build a stronger, more efficient PA network for the preservation of Earth’s species.
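As an illustrative sketch (not this thesis's actual pipeline), the core PA-versus-buffer comparison can be reduced to fitting a robust trend to an SRS-EBV series for each zone and checking whether the trends' confidence intervals overlap. All series below are fabricated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic annual mean NDVI series (a productivity proxy) for one PA and
# its buffer zone; in practice these would be zonal means from an SRS-EBV.
years = np.arange(2001, 2021)
ndvi_pa = 0.62 + 0.0004 * (years - 2001) + rng.normal(0, 0.01, years.size)
ndvi_buffer = 0.60 - 0.0030 * (years - 2001) + rng.normal(0, 0.01, years.size)

def theil_sen_trend(t, y):
    """Robust (Theil-Sen) slope of an annual time series, with 95% CI."""
    slope, intercept, lo, hi = stats.theilslopes(y, t)
    return slope, (lo, hi)

slope_pa, ci_pa = theil_sen_trend(years, ndvi_pa)
slope_buf, ci_buf = theil_sen_trend(years, ndvi_buffer)

# Non-overlapping confidence intervals suggest the PA is changing
# independently of its surroundings.
independent = ci_pa[1] < ci_buf[0] or ci_buf[1] < ci_pa[0]
print(f"PA trend:     {slope_pa:+.4f} NDVI/yr (95% CI {ci_pa[0]:+.4f}..{ci_pa[1]:+.4f})")
print(f"Buffer trend: {slope_buf:+.4f} NDVI/yr (95% CI {ci_buf[0]:+.4f}..{ci_buf[1]:+.4f})")
print("PA trend independent of buffer:", independent)
```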
12. An error methodology based on surface observations to compute the top of the atmosphere, clear-sky shortwave flux model errors
Anantharaj, Valentine (Valentine Gunasekaran), 01 May 2010
Global Climate Models (GCMs) are indispensable tools for projecting climate change. Due to approximations, errors are introduced in GCM computations of atmospheric radiation. Existing methodologies for comparing the GCM-computed shortwave fluxes (SWF) exiting the top of the atmosphere (TOA) against satellite observations do not separate the model errors into atmospheric and surface components. A new methodology has been developed for estimating the GCM systematic errors in the SWF at the TOA under clear-sky (CS) conditions. The new methodology is based on physical principles and utilizes in-situ measurements of SWF at the surface. This error adjustment methodology (EAM) has been validated by comparing GCM results against satellite measurements from the Clouds and the Earth’s Radiant Energy System (CERES) mission. The EAM was implemented in an error estimation model for solar radiation (EEMSR) and then applied to examine the hypothesis that the Community Climate System Model (CCSM), one of the most widely used GCMs, was deficient in representing the annual phenology of vegetation in many areas, and that satellite measurements of vegetation characteristics offered the means to rectify the problem. The CCSM-computed monthly climatologies of TOA CS-SWF were compared to the CERES climatology. The incorporation of satellite-derived land surface parameters improved the TOA SWF in many regions. However, for more meaningful interpretation of the comparisons, it was necessary to account for the uncertainties arising from the radiation calculations of CCSM. In-situ measurements from two sites were used by the EAM to relate the observations and model estimates via a predictive equation and derive the errors in TOA CS-SWF for monthly climatologies. The model climatologies were adjusted using the computed error and then compared to the CERES climatology at the two sites. The new results showed that at one of the sites CCSM consistently overestimated the atmospheric transmissivity, whereas at the other site CCSM overestimated during the spring, summer, and early fall and underestimated during late fall and winter. The bias adjustment using the EAM showed more clearly that, at the two sites, the utilization of satellite-derived land surface parameters improved the TOA CS-SWF.
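A minimal sketch of the bias-adjustment idea, with all fluxes fabricated: regress the model's TOA CS-SWF error on in-situ surface SWF, then use the fitted predictive equation to adjust the model climatology. The linear form and every number here are illustrative assumptions, not the thesis's actual equations, and fitting and scoring on the same site data (as done below for brevity) would be circular in a real validation.

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(1, 13)

# Synthetic monthly climatologies (W m^-2) at one site.  The fabricated
# model overestimates atmospheric transmissivity, so it reflects too
# little SWF back to space relative to the "CERES" reference.
surface_swf = 180 + 90 * np.sin(2 * np.pi * (months - 4) / 12)  # in-situ predictor
true_toa = 0.30 * surface_swf + 45                              # reference climatology
model_toa = true_toa - (0.05 * surface_swf + 8) + rng.normal(0, 2, 12)

# Predictive equation: regress the model error on the surface observation.
error = true_toa - model_toa
b, a = np.polyfit(surface_swf, error, 1)

# Bias-adjust the model climatology and compare RMSE before/after.
adjusted = model_toa + (a + b * surface_swf)
rmse = lambda x: np.sqrt(np.mean((x - true_toa) ** 2))
print(f"RMSE before adjustment: {rmse(model_toa):6.2f} W m^-2")
print(f"RMSE after adjustment:  {rmse(adjusted):6.2f} W m^-2")
```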
13. The role of statistical distributions in vulnerability to poverty analysis
Poghosyan, Armine, 11 April 2024
In regions characterized by semi-arid climates, where households’ welfare primarily relies on rainfed agricultural activities, extreme weather events such as droughts can present existential challenges to livelihoods. To mitigate these risks, numerous social protection programs have been established to assist vulnerable households affected by weather events. Despite efforts to monitor environmental changes through remotely sensed technology, estimating the impact of weather variability on livelihoods remains challenging. This is compounded by the need to select appropriate statistical distributions for weather anomaly measures and household characteristics. We address these challenges by analyzing household consumption data from the Living Standards Measurement Study survey in Niger and systematically evaluating how each input factor affects vulnerability estimates. Our findings show that the choice of statistical distribution can significantly alter outcomes. For instance, using an alternative statistical distribution for vegetation index readings could lead to differences of up to 0.7%, meaning around 150,000 more households might be misclassified as not vulnerable. Similarly, variations in household characteristics could result in differences of up to 10 percentage points, equivalent to approximately 2 million households. Understanding these sensitivities helps policymakers refine targeting and intervention strategies. By tailoring assistance programs more precisely to the needs of vulnerable households, policymakers can ensure that resources are directed where they make the most impact in lessening the adverse effects of extreme weather events, enhancing the resilience of communities in semi-arid regions.

Master of Science

In drought-prone regions where many families rely on rainfed farming, extreme weather can devastate livelihoods. Governments have created aid programs to assist the most vulnerable households during these climate crises, but identifying who needs help is extremely challenging. Part of this difficulty lies in selecting the right statistical methods for analyzing weather data and household information. In this paper, we focus on Niger, a country that experiences frequent droughts and where over 80% of the population depends on rainfed agriculture. By evaluating household consumption data, we aim to identify households that have a high probability of becoming poor as a result of unfavorable weather events and thus need support from social protection programs. In our analysis, we systematically evaluate how each input factor (including household characteristics and statistical distributions) affects households’ likelihood of becoming poor in the event of weather crises. We find that, compared to alternative statistical distributions, using a conventional normal distribution could lead to misclassifying around 150,000 households as non-vulnerable, leaving them without vital assistance. Similarly, using different sets of household characteristics can shift estimates by up to 10 percentage points, equivalent to roughly 2 million households that would miss out on much-needed support. Understanding these sensitivities is crucial for policymakers in refining how aid programs identify vulnerable populations and include them in protection programs. Improved targeting will enhance the resilience of communities in semi-arid regions facing increasing weather variability.
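To make the distributional sensitivity concrete, here is a hedged sketch in the spirit of standard vulnerability analysis (not this thesis's estimation procedure): the probability that a household's consumption falls below a poverty line is computed twice, once assuming consumption is lognormal and once assuming it is normal with matched moments. All parameters are fabricated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic per-household expected log consumption and its standard
# deviation, standing in for fitted values from a vulnerability regression.
n = 10_000
mu_logc = rng.normal(6.0, 0.35, n)   # E[ln consumption]
sigma = np.full(n, 0.45)             # idiosyncratic std. dev. in logs
z = np.exp(5.9)                      # poverty line (consumption units)
threshold = 0.5                      # vulnerable if P(poor) > 0.5

# Distributional choice 1: consumption is lognormal (normal in logs).
p_lognormal = stats.norm.cdf((np.log(z) - mu_logc) / sigma)

# Distributional choice 2: consumption itself is normal, with mean and
# variance matched to the lognormal -- a common but consequential switch.
mean_c = np.exp(mu_logc + sigma**2 / 2)
var_c = (np.exp(sigma**2) - 1) * np.exp(2 * mu_logc + sigma**2)
p_normal = stats.norm.cdf(z, loc=mean_c, scale=np.sqrt(var_c))

v1 = p_lognormal > threshold
v2 = p_normal > threshold
print(f"vulnerable (lognormal): {v1.mean():.1%}")
print(f"vulnerable (normal):    {v2.mean():.1%}")
print(f"households classified differently: {(v1 != v2).mean():.1%}")
```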
14. Land Cover of Virginia From Landsat Thematic Mapper Imagery
Morton, David Dean, 17 August 1998
Knowledge of land cover is important in a variety of natural resources applications. This knowledge becomes more powerful within the spatial analysis capabilities of a geographic information system (GIS). This thesis presents a digital land cover map of Virginia, produced through interpretation of 14 Landsat Thematic Mapper (TM) scenes, circa 1991-1993.
The land cover map, which has a 30m pixel size, was produced entirely with personal computers. Hypercluster aggregation, an unsupervised classification method, was used when hazy and mountainous conditions were not present. A haze correction procedure by Lavreau (1991) was used, followed by a supervised classification on coastal areas. An enhanced supervised classification, focusing on topographic shading, was performed in the mountains. Color infrared photographs, digital maplets, expert knowledge, and other maps were used as training data. Aerial videography transects were flown to acquire reference data.
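As a hedged illustration of the unsupervised step behind hypercluster aggregation (not the thesis's actual software), the sketch below clusters fabricated six-band pixel spectra with plain k-means, over-clustering so that an analyst could then aggregate clusters into land cover classes. All values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Fabricated 6-band TM pixel spectra for three spectrally distinct covers.
pixels = np.vstack([
    rng.normal([30, 25, 20, 90, 60, 25], 4, (300, 6)),  # deciduous forest
    rng.normal([45, 40, 38, 70, 80, 45], 4, (300, 6)),  # herbaceous
    rng.normal([60, 55, 50, 40, 30, 20], 4, (300, 6)),  # water/urban mix
])

def kmeans(x, k, iters=25):
    """Plain k-means: the clustering step behind unsupervised classification."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        # Keep a center unchanged if its cluster empties out.
        centers = np.array([x[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Over-cluster (10 clusters for 3 covers); an analyst would then merge
# clusters into classes -- "hypercluster aggregation" in miniature.
labels, centers = kmeans(pixels, 10)
print("cluster sizes:", np.bincount(labels, minlength=10))
```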
Due to the spatial inaccuracies inherent in the videography reference data, only homogeneous land cover areas were used in the accuracy assessment. Each scene's overall accuracy determined its ordering within the statewide land cover mosaic (i.e., scenes with higher accuracy contributed a larger proportion of the represented area). An accuracy assessment was then performed on the statewide land cover mosaic, yielding an overall accuracy of 81.8% and a Kappa statistic of 0.81. A discussion of potential reasons for land cover class confusion and suggestions for classification improvements are presented.
Overall, deciduous forest was the most common land cover in Virginia. Herbaceous areas were second, accounting for 20% of the land area. Mixed forest and coastal wetlands were the cover types with the least area, each under 3%.

Master of Science
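For reference, the two reported accuracy measures can be computed from an error (confusion) matrix as follows; the matrix here is hypothetical and does not reproduce the thesis's assessment data.

```python
import numpy as np

# Hypothetical 4-class error (confusion) matrix: rows = map class,
# columns = reference class, entries = counts of assessment samples.
cm = np.array([
    [412,  18,  10,   5],   # deciduous forest
    [ 22, 161,  12,   4],   # herbaceous
    [ 15,  14,  96,   6],   # mixed forest
    [  4,   3,   5,  58],   # coastal wetland
])

n = cm.sum()
overall = np.trace(cm) / n                        # overall accuracy
expected = (cm.sum(0) * cm.sum(1)).sum() / n**2   # chance agreement
kappa = (overall - expected) / (1 - expected)     # Cohen's kappa
print(f"overall accuracy: {overall:.1%}, kappa: {kappa:.2f}")
```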
15. Quantitative Analysis of Commodity Markets, Household Vulnerability, and Learning Outcomes
Poghosyan, Armine, 21 August 2024
Chapter 1 examines alternative specifications of futures-based forecasting models to improve upon existing approaches constrained by restrictive assumptions and limited information sets. We replace historical averages with rolling regressions and incorporate current market information through the deviation of the current basis from its historical average. To address potential non-stationarity and structural changes in the cash-futures price relationship, we employ a five-year rolling estimation window. Our findings indicate that the rolling regression approach yields significant improvements in both accuracy and information content of cotton season-average price forecasts, primarily at short forecast horizons.
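A hedged sketch of the contrast between the two forecasting schemes, using fabricated prices: a conventional futures-plus-average-basis forecast versus a five-year (60-month) rolling regression of cash on futures. Details of the chapter's actual specification (contract rolls, basis-deviation term, season-average aggregation) are omitted.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic monthly cotton cash and nearby-futures prices (cents/lb).
idx = pd.period_range("2005-01", "2024-12", freq="M")
futures = pd.Series(70 + np.cumsum(rng.normal(0, 1.5, len(idx))), idx)
basis = pd.Series(-3 + rng.normal(0, 0.8, len(idx)), idx)
df = pd.DataFrame({"futures": futures, "cash": futures + basis})

# Conventional scheme: futures plus the historical-average basis
# (shifted so only past information enters the forecast).
hist_basis = (df.cash - df.futures).rolling(60).mean().shift(1)
naive_fc = df.futures + hist_basis

# Rolling-regression scheme: cash_t = a + b * futures_t, re-estimated each
# month over the trailing five-year window.
def rolling_ols(window=60):
    out = pd.Series(index=df.index, dtype=float)
    for i in range(window, len(df)):
        w = df.iloc[i - window:i]
        b, a = np.polyfit(w.futures, w.cash, 1)
        out.iloc[i] = a + b * df.futures.iloc[i]
    return out

reg_fc = rolling_ols()
rmse = lambda f: np.sqrt(((f - df.cash) ** 2).dropna().mean())
print(f"RMSE, futures + average basis: {rmse(naive_fc):.3f}")
print(f"RMSE, rolling regression:      {rmse(reg_fc):.3f}")
```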
Chapter 2 addresses challenges in vulnerability assessment for semi-arid regions dependent on rainfed agriculture, where extreme weather events pose significant risks to household livelihoods. Despite advancements in remotely sensed technology, accurately estimating weather variability's impact on household livelihoods remains challenging. This study evaluates how weather anomaly measures, spatial resolutions (i.e., the geographic level at which the weather anomaly measures are evaluated), and household characteristics affect estimates of households' likelihood of falling into poverty (i.e., their vulnerability). Combining household consumption data for Niger with remotely sensed agro-environmental measures, we find significant variations in vulnerability estimates based on the choice of weather condition measures (3 percentage points, equivalent to 600,000 households), spatial resolutions (8 percentage points, totaling 1.6 million households), and household characteristics (10 percentage points, equivalent to approximately 2 million households).
Chapter 3 evaluates student learning outcomes from involvement in hands-on learning settings, specifically student-managed investment funds. To assess changes in the technical and practical skills obtained, we combine knowledge tests with grading rubrics. As practical skills, we consider commodity market analysis, critical thinking, informed decision-making, and insightful interpretation of market analysis results. We evaluate our students' understanding of commodity markets and their practical trading skills before and after joining the student-managed investment fund program. We find significant improvements in student learning outcomes, with students showing an average increase of 28% in disciplinary or technical knowledge and 38% in practical skills. Our findings highlight the importance of hands-on learning experiences in bridging the gap between theoretical knowledge and real-world application and in developing the well-rounded skill set demanded by the job market.

Doctor of Philosophy

Chapter 1 explores several alternative specifications of futures-based forecasting models to improve existing approaches constrained by restrictive assumptions and limited information sets. Accurate prediction of cotton prices is vital for the agricultural sector, significantly impacting decisions made by farmers, traders, and policymakers. Reliable forecasts enable farmers to optimize their planting and harvesting strategies, allow traders to manage risk more effectively, and guide policymakers in developing informed agricultural policies. However, the inherent volatility of commodity markets, particularly cotton, presents substantial challenges to price forecasting. Traditional forecasting methods often struggle to capture rapid market changes, resulting in less reliable predictions. Our proposed, more responsive forecasting approaches lead to significant gains in the accuracy and information content of cotton price projections and provide valuable insights that can enhance decision-making throughout the cotton industry.
Chapter 2 explores how extreme weather events, like droughts, affect households in semi-arid regions where livelihoods largely depend on rainfed farming. While satellite technology helps monitor environmental changes, it is still challenging to accurately measure how weather changes impact people's lives. Our study focuses on Niger and uses household survey data to assess how various factors influence our understanding of the risk of falling into poverty (i.e., household vulnerability) due to adverse weather events. We found that the methods we use to measure weather conditions, the geographic scale at which we measure them, and the household information we include can all significantly alter our estimates of how many households are at risk of becoming poor. For example, different methods for measuring weather impacts can change estimates of household vulnerability by about 3 percentage points, affecting around 600,000 households. The geographic level (administrative unit level or within a 20 km buffer around an enumeration area) at which we assess weather conditions can shift our estimates by 8 percentage points, which is equivalent to 1.6 million households. Additionally, considering different household characteristics can change our estimates by 10 percentage points, impacting around 2 million households. Our findings are crucial for policymakers who aim to better understand and address the effects of weather on vulnerable communities.
Chapter 3 evaluates student learning outcomes from participation in the Commodity Investing by Students program, a student-managed investment fund within the Department of Agricultural and Applied Economics at Virginia Tech. Our study focuses on students from the 2022/23 and 2023/24 academic years, assessing both the technical knowledge and the practical skills gained during a year-long involvement in the program. To measure changes in technical skills, we administered knowledge-testing quizzes before and after the training class. We evaluated practical skills, such as commodity market analysis, critical thinking, informed decision-making, and insightful interpretation of market analysis results, through trading projects submitted during and at the end of the training class. We graded these student submissions using a practical-skills evaluation rubric. We find significant improvements in student learning outcomes: on average, students demonstrated a 28% increase in disciplinary knowledge and a 38% improvement in practical skills. Our findings highlight the effectiveness of hands-on learning in improving both technical knowledge and practical skills that are highly valued in today's job market.
16. Predicting floods from space: a case study of Puerto Rico
Emigh, Anthony James, 01 May 2019
Floods are a significant threat to communities around the world and require substantial resources and infrastructure to predict. Limited local resources in developing nations make it difficult to build and maintain dense sensor networks like those present in the United States, creating a large disparity in flood prediction across borders. To address this disparity, I operated the Iowa Flood Center Top Layer model to predict floods in Puerto Rico without relying on in-situ data measurements. Instead, all model forcing was provided by satellite remote sensing datasets that offer near-global coverage.
I used three datasets gathered via satellite remote sensing to build and operate watershed streamflow models: elevation data obtained by the Space Shuttle Endeavour through the Shuttle Radar Topography Mission (SRTM), rainfall estimates gathered by a constellation of satellites through the Global Precipitation Measurement Mission (GPM), and evapotranspiration rate estimates collected by Moderate Resolution Imaging Spectroradiometer (MODIS) sensors aboard the Aqua and Terra satellites. While these satellite remote sensing datasets make observations of nearly the entire world, their spatiotemporal resolution is coarse compared to conventional on-the-ground measurements.
Hydrologic models were assembled for 75 basins upstream of streamflow gages monitored by the United States Geological Survey (USGS). Model simulations were compared to real-time measurements at these gages. Continuous simulations spanning 58 months achieved poor Nash-Sutcliffe and Kling-Gupta efficiencies of -112.0 and -0.5, respectively. The sources of error that influence model performance were investigated, underlining some limitations of relying solely on satellite data for operational flood prediction efforts.
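The two skill scores cited above are standard metrics; a minimal implementation follows, with a fabricated toy series showing how badly overshot flood peaks drive the Nash-Sutcliffe score strongly negative.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe Efficiency: 1 is perfect, < 0 is worse than the mean."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Kling-Gupta Efficiency (2009 form): correlation, variability, bias."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()    # variability ratio
    beta = sim.mean() / obs.mean()   # bias ratio
    return 1 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# Toy streamflow series (m^3/s): a simulation that badly overshoots the
# flood peak, as coarse satellite forcing can cause.
obs = np.array([5, 6, 8, 40, 22, 10, 7, 6, 5, 5], dtype=float)
sim = np.array([5, 7, 9, 160, 90, 30, 9, 6, 5, 5], dtype=float)
print(f"NSE: {nse(sim, obs):7.2f}")
print(f"KGE: {kge(sim, obs):7.2f}")
```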
17. Intercomparison of Methods to Apply Satellite Observations for Inverse Modelling of NOx Surface Emissions
Padmanabhan, Akhila L., 03 September 2013
Knowledge of NOx (NO2 + NO) emissions is useful for understanding processes affecting air quality and climate change. Emission inventories of surface NOx have high uncertainties. Satellite remote sensing has enabled measurements of trace gases in the atmosphere over large regional and temporal scales. Inverse modeling of NO2 observations from satellites can be used to improve existing emissions inventories. This study seeks to understand the differences between two methods of inverse modeling: the mass balance approach and the adjoint approach, using the GEOS-Chem chemical transport model and its adjoint. Using both synthetic satellite observations and observations derived from the SCIAMACHY satellite instrument, this study found that the performance of the two inversions was affected by pixel smearing and observational error. Smearing reduced the accuracy of the mass balance approach, while high observational error reduced the accuracy of the adjoint approach. However, both approaches improved the a priori emissions estimate.
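A toy illustration of the mass balance update on a one-dimensional grid of fabricated emissions and columns; the smoothing kernel stands in for the transport-driven smearing the abstract identifies as the method's weakness. This is a sketch of the general technique, not the GEOS-Chem implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D "satellite swath": a priori NOx emissions and the NO2 columns a
# model would simulate from them (columns proportional to emissions, then
# smeared by transport).
nx = 50
e_prior = np.zeros(nx)
e_prior[20:23] = 5.0                                  # emission hotspot
kernel = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)
kernel /= kernel.sum()
omega_model = np.convolve(e_prior, kernel, mode="same")

# "True" emissions are 50% higher; observed columns inherit the same
# smearing plus retrieval noise.
e_true = 1.5 * e_prior
omega_obs = np.convolve(e_true, kernel, mode="same") + rng.normal(0, 0.02, nx)

# Mass balance update, cell by cell; beta ~ 1 assumes a locally linear
# relationship between a column and the emissions beneath it.
beta = 1.0
scale = 1 + beta * (omega_obs - omega_model) / np.maximum(omega_model, 1e-6)
e_post = e_prior * scale

core = slice(20, 23)
print(f"prior hotspot total:     {e_prior[core].sum():.2f}")
print(f"posterior hotspot total: {e_post[core].sum():.2f}  (truth: {e_true[core].sum():.2f})")
```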
18. Ice dynamics and mass balance in the grounding zone of outlet glaciers in the Transantarctic Mountains
Marsh, Oliver John, January 2013
The Antarctic grounding zone has a disproportionately large effect on glacier dynamics and ice sheet stability relative to its size but remains poorly characterised across much of the continent. Accurate ice velocity and thickness information is needed in the grounding zone to determine glacier outflow and establish to what extent changing ocean and atmospheric conditions are affecting the mass balance of individual glacier catchments.
This thesis describes new satellite remote sensing techniques for measuring ice velocity and ice thickness, validated using ground measurements collected on the Beardmore, Skelton and Darwin Glaciers and applied to other Transantarctic Mountain outlet glaciers to determine ice discharge. Outlet glaciers in the Transantarctic Mountains provide an important link between the East and West Antarctic Ice Sheets but remain inadequately studied. While long-term velocities in this region are shown here to be stable, instantaneous velocities are sensitive to stresses induced by ocean tides, with fluctuations of up to 50% of the mean observed in GPS measurements. The potential error induced in averaged satellite velocity measurements by these effects is shown to be resolvable above background noise in the grounding zone but to decrease rapidly upstream.

Using a new inverse finite-element modelling approach based on regularization of the elastic-plate bending equations, tidal flexure information from differential InSAR is used to calculate ice stiffness and infer thickness in the grounding zone. This technique is shown to reproduce the thickness distribution of the Beardmore Glacier successfully, eliminating current issues in the calculation of thickness from freeboard close to the grounding line, where ice is not in hydrostatic equilibrium. Modelled thickness agrees to within 10% of ground-penetrating radar measurements. Calibrated freeboard measurements and tide-free velocities in the grounding zones of glaciers in the western Ross Sea are used to calculate grounding-zone basal melt rates, with values between 1.4 and 11.8 m a⁻¹ in this region. While strongly dependent on grounding line ice thickness and velocity, melt rates show no latitudinal trend between glaciers, although detailed error analysis highlights the need for much improved estimates of firn density distribution in regions of variable accumulation such as the Transantarctic Mountains.
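The thickness inversion rests on the fact that the spatial decay rate of tidal flexure depends on flexural rigidity, and hence on ice thickness. The sketch below forward-models an idealized one-dimensional elastic-beam flexure profile and recovers thickness by a brute-force parameter scan; the elastic constants and the closed-form profile are textbook simplifications, not the thesis's regularized finite-element inversion.

```python
import numpy as np

# Elastic-beam grounding-zone flexure: for a semi-infinite plate, tidal
# deflection (normalized by tidal amplitude) decays roughly as
# w(x) = exp(-l x) * cos(l x), where l = (rho_w g / (4 D))^(1/4) and the
# flexural rigidity is D = E h^3 / (12 (1 - nu^2)).
E, nu = 1.0e9, 0.3         # assumed effective Young's modulus (Pa), Poisson ratio
rho_w, g = 1028.0, 9.81    # seawater density (kg/m^3), gravity (m/s^2)

def decay_param(h):
    D = E * h**3 / (12 * (1 - nu**2))
    return (rho_w * g / (4 * D)) ** 0.25

# Forward-model a flexure profile for "true" 800 m thick ice, then add
# DInSAR-like noise.
rng = np.random.default_rng(5)
x = np.linspace(0, 8000, 200)          # distance from grounding line (m)
h_true = 800.0
l = decay_param(h_true)
w_obs = np.exp(-l * x) * np.cos(l * x) + rng.normal(0, 0.02, x.size)

# Invert: scan candidate thicknesses, keep the best least-squares fit.
def misfit(h):
    lh = decay_param(h)
    return np.sum((np.exp(-lh * x) * np.cos(lh * x) - w_obs) ** 2)

candidates = np.arange(200.0, 1500.0, 5.0)
h_best = min(candidates, key=misfit)
print(f"recovered thickness: {h_best:.0f} m (true {h_true:.0f} m)")
```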
19. Flaring and pollution detection in the Niger Delta using remote sensing
Morakinyo, Barnabas Ojo, January 2015
Through the Global Gas Flaring Reduction (GGFR) initiative, substantial effort and international attention have been focused on the reduction of gas flaring since 2002 (Elvidge et al., 2009). Nigeria is rated second in the world for gas flaring, after Russia. In an attempt to reduce and eliminate gas flaring, the federal government of Nigeria has implemented a number of gas flaring reduction projects, but poor regulatory policies have been mostly unsuccessful in phasing it out. This study examines the effects of pollution from gas flaring using multiple satellite-based sensors (Landsat 5 TM and Landsat 7 ETM+), with a focus on vegetation health in the Niger Delta. Over 131 flaring sites in all 9 states (Abia, Akwa Ibom, Bayelsa, Cross Rivers, Delta, Edo, Imo, Ondo and Rivers) of the Niger Delta region were identified, of which 11 sites in Rivers State were examined using a case study approach. Land Surface Temperature data were derived using a novel procedure that draws on visible-band information to mask out clouds and identify appropriate emissivity values for different land cover types. In 2503 of the 3001 Landsat subscenes analysed, Land Surface Temperature was elevated by at least 1 °C within 450 m of the flare. The results from fieldwork, carried out at the Eleme Refinery II Petroleum Company and Onne Flow Station, are compared to the Landsat 5 TM and Landsat 7 ETM+ data. Results indicate that Landsat data can detect gas flares and their associated pollution effects on vegetation health with acceptable accuracy for both Land Surface Temperature (range: 0.120 to 1.907 K) and the Normalized Difference Vegetation Index (sd ± 0.004). Available environmental factors such as facility size, stack height, and time were considered. Finally, a time series analysis (1984 to 2013) of the impact of pollution on vegetation health shows an annual decrease in NDVI within 120 m of the flare, and that the spatio-temporal variability of NDVI at each site is influenced by local factors. This research demonstrated that only 5% of the variability in δLST and only 12% of the variability in δNDVI with distance from the flare stack could be accounted for by the variables considered in this study. This suggests that missing factors (gas flaring volume and vegetation speciation) play a significant role in the variability of δLST and δNDVI, respectively.
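For orientation, here is a minimal sketch of the two per-pixel diagnostics used in work like this, with fabricated pixel values: NDVI from red/NIR reflectance, and emissivity-corrected LST from the Landsat 5 TM band 6 brightness temperature. The thermal calibration constants are the published TM values; the threshold-based emissivity assignment is a crude stand-in for the thesis's land-cover-based procedure.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from surface reflectance."""
    return (nir - red) / (nir + red + 1e-10)

def brightness_temp_tm(L6):
    """Landsat 5 TM band-6 radiance (W m^-2 sr^-1 um^-1) -> at-sensor K."""
    K1, K2 = 607.76, 1260.56   # published TM thermal calibration constants
    return K2 / np.log(K1 / L6 + 1)

def lst(L6, emissivity, wavelength_um=11.45):
    """Emissivity-corrected land surface temperature (K)."""
    tb = brightness_temp_tm(L6)
    rho = 1.438e-2             # h * c / k_B in m K
    return tb / (1 + (wavelength_um * 1e-6 * tb / rho) * np.log(emissivity))

# Fabricated pixels near a hypothetical flare site.
red = np.array([0.08, 0.12, 0.05])
nir = np.array([0.45, 0.30, 0.40])
L6 = np.array([9.5, 10.4, 9.8])                        # thermal radiance
emis = np.where(ndvi(red, nir) > 0.2, 0.985, 0.96)     # crude emissivity map
print("NDVI:", np.round(ndvi(red, nir), 3))
print("LST (K):", np.round(lst(L6, emis), 2))
```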
20. Putting it all together: Geophysical data integration
Kvamme, Kenneth L., Ernenwein, Eileen G., Menzer, Jeremy G., 01 January 2018
The integration of information from multiple geophysical and other prospection surveys of archaeological sites and regions leads to a richer and more complete understanding of subsurface content, structure, and physical relationships. Such fusions of information occur within a single geophysical data set or between two or more geophysical and other prospection sources in one, two, or three dimensions. An absolute requirement is the accurate coregistration of all information to the same coordinate space. Data integrations occur at two levels. At the feature level, discrete objects that denote archaeological features are defined, usually subjectively, through the manual digitization of features interpreted in the data, although there is growing interest in automated feature identification and extraction. At the pixel level, distributional issues of skewness and outliers, high levels of noise that obfuscate targets of interest, and a lack of correlation between largely independent dimensions must be confronted. Nevertheless, successful fusions occur using computer graphic methods, simple arithmetic combinations, and advanced multivariate methods, including principal components analysis and supervised and unsupervised classifications. Four case studies are presented that illustrate some of these approaches and offer advancement into new domains.
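A compact sketch of pixel-level fusion by principal components, on fabricated coregistered rasters: robust standardization tempers the skewness and outliers noted above, and the leading component concentrates the variance shared across sensors. This is one of several fusion routes the chapter surveys, not a prescribed workflow.

```python
import numpy as np

rng = np.random.default_rng(9)

# Three coregistered prospection rasters over the same grid: magnetometry,
# resistivity, GPR amplitude.  A buried rectilinear feature is weakly
# present in each, hidden in independent sensor noise.
shape = (100, 100)
feature = np.zeros(shape)
feature[40:60, 30:70] = 1.0
mag = 0.6 * feature + rng.normal(0, 1.0, shape)
res = -0.5 * feature + rng.normal(0, 1.0, shape)
gpr = 0.7 * feature + rng.normal(0, 1.0, shape)

# Pixel-level fusion: stack, robust-standardize (median/IQR), then project
# onto the leading eigenvector of the covariance of the standardized stack.
stack = np.stack([m.ravel() for m in (mag, res, gpr)], axis=1)
med = np.median(stack, axis=0)
iqr = np.percentile(stack, 75, axis=0) - np.percentile(stack, 25, axis=0)
z = (stack - med) / iqr

cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = (z @ eigvecs[:, -1]).reshape(shape)   # first principal component
if pc1[40:60, 30:70].mean() < pc1.mean():   # fix PCA's arbitrary sign
    pc1 = -pc1

inside, outside = pc1[40:60, 30:70].mean(), pc1[:30, :].mean()
print(f"PC1 mean inside feature: {inside:+.2f}, background: {outside:+.2f}")
print(f"variance explained by PC1: {eigvals[-1] / eigvals.sum():.1%}")
```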