Study For Development Of A Blast Layer For The Virtual Range Project

Rosales, Sergio 01 January 2004
In this work we develop an integrated blast-propellant-facility analysis that evaluates, using two different approaches, the blast-related impact of an explosive accident involving the Space Shuttle during the first ten seconds after launch at Kennedy Space Center. The blast-related risk associated with an explosion at this stage is high because of the quantity of energy involved in multiple, complex processes. One approach employed BlastFX®, a software system that estimates the level of damage to people and buildings from an explosive device and presents the results in a complete report that illustrates and facilitates the evaluation of consequences. The other approach employed the Hopkinson-Cranz scaling law to estimate similar effects at greater distances and for larger TNT-equivalent masses; specifically, beyond 500 m and 45,400 kg, which are the range and TNT-content limits that our version of BlastFX® can cover. Much research has been done on explosion phenomena for both solid and liquid propellants and on the laws that govern blast waves, so our methodology builds on the foundation provided by an extensive literature review and on the actual capabilities of an application like BlastFX®. By integrating lessons from the literature with the capabilities of the software, we obtained useful information for evaluating different scenarios, all resting on the well-studied dependence of blast-wave behavior on distance. Throughout, the focus is the Space Shuttle system, whose propellant mass is the source of our analysis and the core of this work. Estimating the risks involved and providing results for different scenarios augments the collective knowledge of the risks associated with space exploration.
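
The Hopkinson-Cranz scaling law referenced above relates blast effects across charge sizes through the scaled distance Z = R / W^(1/3): two explosions produce similar blast overpressures at equal scaled distances. A minimal illustrative sketch follows (not the thesis's implementation; the 500 m and 45,400 kg BlastFX® limits quoted above serve only as example inputs):

```python
# Minimal sketch of Hopkinson-Cranz (cube-root) scaling.
# Two explosions produce similar blast effects when their
# scaled distances Z = R / W**(1/3) are equal.

def scaled_distance(range_m: float, tnt_kg: float) -> float:
    """Scaled distance Z in m/kg^(1/3) for a standoff range
    and a TNT-equivalent charge mass."""
    return range_m / tnt_kg ** (1.0 / 3.0)

def equivalent_range(z: float, tnt_kg: float) -> float:
    """Standoff range giving the same scaled distance Z
    for a different charge mass."""
    return z * tnt_kg ** (1.0 / 3.0)

# Example: the BlastFX coverage limits cited above (500 m, 45,400 kg)
z = scaled_distance(500.0, 45_400.0)   # ~14.0 m/kg^(1/3)
print(f"Z = {z:.1f} m/kg^(1/3)")

# Similar blast severity is expected at ~65 m from a 100 kg charge.
print(f"R(100 kg) = {equivalent_range(z, 100.0):.1f} m")
```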

Alternative Foam Treatments For The Space Shuttle's External Tank

Dreggors, Kirsten 01 January 2005
The Space Shuttle Columbia accident and the recent attention surrounding Discovery's return to space brought intense media scrutiny to the foam products used on the External Tank (ET). In both cases, videos showed chunks of foam or ablative material falling away from the ET during liftoff. This led to several years of investigation and research into the exact cause of the accident and potential solutions to avoid the problem in the future. Several design changes were made prior to the return to flight this year, but the ET still shed foam during liftoff. Since the Columbia accident, the loss of foam from ETs has been a significant area of interest for NASA, United Space Alliance, and Lockheed Martin. The Columbia Accident Investigation Board did not evaluate alternative materials but certainly highlighted the need for change. Most previous research concentrated on improving the design and/or the application process of the current materials. In recent years, some research and testing has examined whether a glass-microsphere composite foam would be an acceptable alternative, but this work was overtaken by the need for immediate changes to return the shuttle to flight in time to deliver supplies to the International Space Station. Through a better understanding of the foam products currently used on the ET, other products can be evaluated for future space shuttle flights and for potential applications on new space vehicles. The material properties and required functionality of alternative materials can be compared with those of the current materials to determine whether suitable replacement products exist; this research also lends itself to the development of future space-flight and unmanned launch vehicles. In this paper, the feasibility of alternative materials for the space shuttle's External Tank will be investigated. Research on the products used on the ET and the set of functional requirements driving their selection will be presented. The material properties of the current ET foam products will be collected, and an evaluation of how those properties meet the functional requirements will be performed. Significant research on polymeric foams and ablative materials will then be presented to show how these products can be applied in this industry. With this research and analysis, an alternative product will be selected and its effectiveness evaluated, to determine the feasibility of a product change for the current ET given the importance of maintaining the shuttle launch schedule. This research will also be used to evaluate the potential application of the alternative product on future platforms. There are several possible outcomes: the research could recommend a change to the ET foam material, or identify a perfectly acceptable alternative whose adoption would nonetheless carry a cost or schedule impact; it is also possible that no suitable alternative exists given the current functional requirements. In any case, the alternative material could have future applications on new space vehicles. A set of results from the research and analysis will be provided, along with a recommendation on a future material for use on space vehicles.
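
The evaluation the abstract outlines, screening candidate-material properties against the ET foam's functional requirements, can be pictured as a simple pass/fail matrix. The sketch below is purely illustrative: the property names, thresholds, and candidate values are hypothetical placeholders, not data from the thesis or from actual ET materials.

```python
# Illustrative requirements screening with hypothetical numbers.
# Each requirement is (property, predicate); a candidate passes
# only if every predicate holds.

requirements = {
    "density_kg_m3":      lambda v: v <= 50.0,   # keep tank mass low
    "thermal_cond_W_mK":  lambda v: v <= 0.03,   # insulate cryogenic propellant
    "max_service_temp_K": lambda v: v >= 450.0,  # survive ascent heating
}

candidates = {
    "current_ET_foam":   {"density_kg_m3": 38.0, "thermal_cond_W_mK": 0.022,
                          "max_service_temp_K": 460.0},
    "glass_microsphere": {"density_kg_m3": 60.0, "thermal_cond_W_mK": 0.028,
                          "max_service_temp_K": 520.0},
}

for name, props in candidates.items():
    failures = [p for p, ok in requirements.items() if not ok(props[p])]
    verdict = "PASS" if not failures else f"FAIL on {', '.join(failures)}"
    print(f"{name}: {verdict}")
```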

Evaluation Of Space Shuttle Tile Subnominal Bonds

Snapp, Cooper 01 January 2006
This study traces the history of Space Shuttle Reusable Surface Insulation, which was designed and developed for use on the United States Orbiter fleet to protect against the high heating experienced during reentry through Earth's atmosphere. Specifically, the tile system, which is attached to the structure by means of an RTV adhesive, has experienced situations in which the bonds are identified as subnominal. The history of these subnominal conditions is presented, along with a recent identification of a subnominal bond between the Strain Isolation Pad and the tile substrate itself. Tests were run to identify the cause of these subnominal conditions and to show how they were proved acceptable for flight. The study also examines approaches that could be used to identify subnominal tile bonds non-destructively prior to flight; several non-destructive testing options are identified, and recommendations are given for future research on this topic. Finally, the study discusses a recent case from the STS-114 mission, in which gap fillers that had not properly adhered to the substrate were found protruding past the Outer Mold Line of the vehicle, requiring an unprecedented spacewalk to remove them and allow a safe reentry through the atmosphere.

Remote Sensing Approach for Hydrologic Assessments of Complex Lake Systems

Bhang, Kon Joon 04 September 2008
No description available.

Validation of a System to Analyze Jump Kinetics during Musculoskeletal Rehabilitation

Sacco, John D. 11 September 2009
No description available.

Managing the Risk of Failure in Complex Systems: Insight into the Space Shuttle Challenger Failure

Vantine, William L. 17 December 1998
This dissertation presents a new approach for identifying, assessing, mitigating, and managing the risks of failure in complex systems. It describes the paradigm commonly used today to explain such failures and proposes an alternative paradigm that expands the lens for viewing failures to include alternative theories derived from modern physics. Further, it describes the foundation for each paradigm and illustrates how each may be applied to a particular system failure. Today, system failure is commonly analyzed using a paradigm grounded in classical, or Newtonian, physics. This branch of science embraces the principles of reductionism, cause and effect, and determinism. Reductionism is used to dissect the system failure into its fundamental elements. The principle of cause and effect links the actions that led to the failure to the consequences that result. Analysts use determinism to establish the linear link from one event to another, forming the chain that reveals the path from cause to consequence. As a result, each failure has a single cause and a single consequence. An alternative paradigm, labeled contemporary, incorporates the Newtonian foundation of the classical paradigm but does not accept its principles as inviolate. Instead, the contemporary paradigm adopts the principles found in the theories of relativity, quantum mechanics, chaos, and complexity. These theories hold that any analysis of the failure is affected by the frame of reference of the observer. Causes may create non-linear effects, and these effects may not be directly observable. In this paradigm, there are assumed to be multiple causes for any system failure. Each cause contributes to the failure to a degree that may not be measurable using the techniques of classical physics. The failure itself generates multiple consequences that may be remote in place or time from the site of the failure and that may affect multiple individuals and organizations. Further, these consequences are not inevitable; they may be altered by actions taken prior to, and responses taken after, the occurrence of the failure. The classical and contemporary paradigms are applied using a single embedded case study: the failure of the space shuttle Challenger. Sources, including literature and popular-press articles published before and after the failure as well as NASA documents, are reviewed to determine the utility of each paradigm. These reviews are supplemented by interviews with individuals involved in the failure and in the official investigations that followed. This dissertation demonstrates that a combination of the classical and contemporary paradigms provides a more complete and more accurate picture of system failure. This combination links the non-deterministic elements of system-failure analysis to the more conventional, deterministic theories. The new framework recognizes that complete prevention of failure cannot be achieved; instead, it makes provisions for preparing for and responding to system failure. / Ph. D.

Sharing the Shuttle with America: NASA and Public Engagement after Apollo

Kaminski, Amy Paige 30 March 2015
Historical accounts depict NASA's interactions with American citizens beyond government agencies and aerospace firms since the 1950s and 1960s as efforts to 'sell' its human space flight initiatives and to position external publics as would-be observers, consumers, and supporters of such activities. Characterizing citizens solely as celebrants of NASA's successes, however, masks the myriad publics, engagement modes, and influences that comprised NASA's efforts to forge connections between human space flight and citizens after Apollo 11 culminated. While corroborating the premise that NASA constantly seeks public and political approval for its costly human space programs, I argue that maintaining legitimacy in light of shifting social attitudes, political priorities, and divided interest in space flight required NASA to reconsider how to serve and engage external publics vis-à-vis its next major human space program, the Space Shuttle. Adopting a sociotechnical imaginary featuring the Shuttle as a versatile technology that promised something for everyone, NASA sought to engage citizens with the Shuttle in ways appealing to their varied, expressed interests and became dependent on some publics' direct involvement to render the vehicle viable economically, socially, and politically. NASA's ability and willingness to democratize the Shuttle proved difficult to sustain, however, as concerns evolved following the Challenger accident among NASA personnel, political officials, and external publics about the Shuttle's purpose, value, safety, and propriety. Mapping the publics and engagement modes NASA regarded as crucial to the Shuttle's legitimacy, this case study exposes the visions of public accountability and other influences -- including changing perceptions of a technology -- that can govern how technoscientific institutions perceive and engage various external publics. Doing so illuminates the prospects and challenges associated with democratizing decisions and uses for space and, perhaps, other technologies managed by U.S. government agencies while suggesting a new pathway for scholarly inquiry regarding interactions between technoscientific institutions and external publics. Expanding NASA's historical narrative, this study demonstrates that entities not typically recognized as space program contributors played significant roles in shaping the Shuttle program, substantively and culturally. Conceptualizing and valuing external publics in these ways may prove key for NASA to sustain human space flight going forward. / Ph. D.

The Steep Climb to Low Earth Orbit: A History of the Space Elevator Community's Battle Against the Rocket Paradigm

Pearson, Derek J. 13 June 2022
This thesis examines the growth of the space elevator community in America from 1975 to 2010. It argues that the continued practical failures of the space elevator, a proposed technology for efficiently transporting payloads and people into space without conventional propulsion sources, resulted from a technological paradigm built around the rocket and supported by a traditional engineering culture. After its triumph in landing men on the Moon from 1969 to 1972, the United States' National Aeronautics and Space Administration (NASA) sought to advance novel concepts for further space exploration, but it fumbled in pursuing nontraditional notions of escaping the atmosphere such as the space elevator. Employing interviews with space elevator advocates Bradley Edwards and Michael Laine and other primary and secondary sources, this thesis also draws on concepts such as technological paradigms, engineering cultures, and the technological sublime. It concludes by demonstrating how success eluded the marginalized space elevator researchers who found themselves grappling with the vast social and technical system that supported the rocket's hegemony. / Master of Arts / This thesis examines the growth of the space elevator community in America from 1975 to 2010. It argues that the continued practical failures of the space elevator, a proposed technology for efficiently transporting payloads and people into space without conventional propulsion sources, resulted from a technological paradigm built around the rocket and supported by a traditional engineering culture. The technological paradigm of the rocket encompassed all of the people and practices that made the rocket work. After its triumph in landing men on the Moon from 1969 to 1972, the United States' National Aeronautics and Space Administration (NASA) sought to advance novel concepts for further space exploration, but it fumbled in pursuing nontraditional notions of escaping the atmosphere such as the space elevator. Much of this failure is owed to an engineering culture within NASA that looked down upon challenging the rocket. This thesis demonstrates how success eluded the marginalized space elevator researchers who found themselves grappling with the vast social and technical system that supported the rocket's hegemony.

Investigations of a Piston Expansion Machine with Integrated Heat-Exchanger Surfaces (Wärmeübertrager-Expander) for Realizing a Novel Neon Low-Temperature Process

Fredrich, Ole 01 March 2005
Many applications of high-temperature superconductivity operate advantageously in the temperature range between 30 K and 50 K. For this range, however, only a few suitable cryocoolers with small refrigerating capacity (1-2 W) and good efficiency exist. Owing to its material properties, neon is an excellent refrigerant for this temperature level, as demonstrated by a realized Joule-Thomson (JT) demonstration plant. A process analysis results in the presented cycle, which is specifically adapted to the properties of neon. By superimposing heat transfer and work-extracting expansion and by including a JT stage, a comparatively high efficiency can be reached even with less efficient components. By integrating heat-exchanger surfaces into a piston expansion machine, a new concept is proposed for producing refrigeration over a large temperature range in many expansion steps without using many expanders. This unit is referred to as a heat exchanger-expander (Wärmeübertrager-Expander, WE). A working space with a conical basic shape improves the heat-transfer coefficient and enlarges the heat-transfer area. Several test machines were investigated; the tests identified the main loss sources and critical zones. Within the test conditions, it was demonstrated that near-isothermal expansion and compression are possible for the specified pressure ratio. Options for reducing axial heat conduction are presented. Two simulation programs were used. The heat-exchanger program simulates the heat-transfer processes taking axial heat conduction into account, with the expansion work entering as a stationary heat sink. The resulting stationary temperature profile is the basis for calculating the expansion work, accounting for real-gas properties, in the expander program. A basic variant of the heat exchanger-expander was defined for the neon low-temperature application, and the programs were used to study the influence of various parameters on refrigerating capacity and efficiency. The heat exchanger-expander is considered as part of the described process with a JT stage. The refrigerating capacity exhibits a maximum as a function of both mass flow and stroke; the shuttle loss, through heat transport by the piston, shifts the effective refrigerating capacity toward smaller strokes. The inlet temperature of the low-pressure stream into the heat exchanger-expander, determined by the quality (NTU) of the JT heat exchanger, has a large influence on the refrigerating capacity: with increasing inlet temperature, the NTU value for the working space and hence the refrigerating capacity increase. The maximum of the refrigerating capacity does not coincide with the optimum efficiency. With decreasing mass flow, the efficiency tends toward an optimum, set by the growing influence of axial heat conduction and limited by the minimum charge of the machine due to the dead volume. The influence of the mass flow is decisive; the inlet temperature of the low-pressure stream and the stroke influence the optimal efficiency as secondary quantities. The influence of axial heat conduction on refrigerating capacity and efficiency is shown through comparative calculations. Specifically, for an inlet pressure of 200 bar, an outlet pressure of 60 bar, and an inlet temperature of the low-pressure stream of 80 K, the basic variant yields a maximum effective refrigerating capacity of 1.3 W at a mass flow of 0.22 g/s and a stroke of about 17 mm; the effective efficiency under these conditions is about 14 %. Commercial single-stage split-Stirling coolers reach efficiencies of about 7 % at 42 K. The proposed configuration thus represents a concept that, despite open technological questions, can exceed the efficiency level of known cryocoolers.
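
For orientation, a back-of-envelope ideal-gas estimate of the isothermal expansion work available at the quoted operating point is sketched below. The assumed mean expansion temperature (40 K) is a placeholder, and the thesis computes its efficiency (Gütegrad) from real-gas properties and the full JT cycle, so this rough ratio is not expected to reproduce the quoted 14 %.

```python
# Back-of-envelope check of the quoted operating point
# (ideal gas; the thesis itself uses real-gas properties).
import math

R = 8.314                  # J/(mol K), universal gas constant
M_NEON = 20.18e-3          # kg/mol
r_neon = R / M_NEON        # ~412 J/(kg K), specific gas constant of neon

p_in, p_out = 200e5, 60e5  # Pa (200 bar -> 60 bar)
T = 40.0                   # K, assumed mean expansion temperature (30-50 K range)
mdot = 0.22e-3             # kg/s, quoted mass flow

# Ideal isothermal expansion work per unit mass: w = r T ln(p_in/p_out)
w = r_neon * T * math.log(p_in / p_out)
P_ideal = mdot * w         # ideal refrigeration power, ~4.4 W

print(f"ideal isothermal refrigeration: {P_ideal:.1f} W")
print(f"quoted effective capacity 1.3 W -> ratio {1.3 / P_ideal:.0%}")
```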

Development of a Flood Model Based on Globally-Available Satellite Data for the Papaloapan River, Mexico

Kreiselmeier, Janis January 2015
Flood inundation modelling is highly dependent on an accurate representation of floodplain topography. Accurate remotely sensed data are often unavailable or expensive, especially in developing countries. As an alternative, freely available Digital Elevation Models (DEMs), such as the near-global Shuttle Radar Topography Mission (SRTM) data, have come into the focus of flood modellers. To what extent these low-resolution data can be exploited for hydraulic modelling is still an open research question. This benchmarking study investigated the potentials and limitations of the SRTM data set for flood inundation modelling using the example of the Papaloapan River, Mexico. It also tested the effect of removing the vegetation signal from the SRTM DEM, as in Baugh et al. (2010). A reference model based on a light detection and ranging (LiDAR) DEM was set up with the model code LISFLOOD-FP and run for two flood events. Test models based on SRTM DEMs were run, and the output flood extents were compared to the reference model by applying a measure of fit. This measure of fit, based on binary wet/dry maps of both model outputs, indicates how well a test model reproduces the reference flood extent, as a percentage from theoretically 0 to 100 %. The SRTM-based models could not reproduce the promising results of previous studies. Flood extents were mostly underestimated, and the commonly flooded area consisted almost exclusively of the main channel surface. One likely reason was the much steeper slope of the SRTM DEM compared with the LiDAR DEM, so that water was probably conveyed much faster through the main channel. Too-high bank cells, as well as generally more pronounced elevation differences across the whole floodplain, were further problems of the SRTM DEM that prevented accurate flood inundation simulations. Vegetation signal removal was successful to a certain degree, improving the fit by about 10 %. However, a realistic flood-extent shape could not be simulated owing to the large pixel size of the canopy-height data set used. Moreover, the conditioned models overestimated flooded areas with increasing vegetation signal removal, rendering some of the models useless for comparison, as water leaving the model domain could not be accounted for in the measure of fit. This study showed the limitations of SRTM data for flood inundation modelling, where an accurate approximation of the river slope as well as accurately captured bank cells and floodplain topography are crucial for the simulated outcome. Vegetation signal removal has been shown to be potentially useful but should rather be applied to more densely vegetated catchments.

/ Floods create major problems worldwide, and more and more people live in areas at risk of flooding. Climate change is moreover expected to make floods more frequent in many parts of the world. Flood damage can exceed several billion US$, and floods cause problems beyond economic losses: over the past ten years, more than 60,000 people have died in floods and a further 900,000,000 have been affected in some way. It is therefore important to know which areas are at high risk. One of the tools for assessing flood risk is hydraulic computer models, which attempt to predict how a given flood will spread. The models are based on physical principles and topographic information. Ideally, one wants topographic data of high quality and resolution, often collected by remote sensing from aircraft; one example is laser-based LiDAR data. However, LiDAR is often expensive or unavailable in remote areas and developing countries, where such data are needed most. Researchers have therefore tried to use globally available topographic data of lower quality for hydraulic models. One such data set is the so-called SRTM data from the American space agency NASA, collected by radar from satellites. Several studies have obtained good results in flood modelling with SRTM, but it must be tested further on more catchments. In this study, SRTM was used in a hydraulic model of the Mexican river Papaloapan. To see how well (or poorly) the SRTM model simulates the spread of a flood, it was compared with a model based on high-quality LiDAR data; both models simulated the same flood events. Topographic information from SRTM data is usually incorrect where vegetation is very dense and tall, because the radar signal does not reach the ground, so the estimated elevation is too high in such areas. For this reason, the study also tested how much the SRTM model would improve if some vegetation were removed. Unfortunately, the SRTM model performed poorly for this study area and predicted much smaller floods than the presumably more accurate LiDAR model. Removing vegetation improved the SRTM model to some extent, but not sufficiently for this area. This study shows that it is important to continue investigating how suitable and useful SRTM is, since it has proven not to be appropriate for predicting floods in all parts of the world.
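
The measure of fit described in the abstract compares binary wet/dry maps cell by cell. The abstract does not state the exact formula; the sketch below uses one common formulation from flood-model benchmarking, F = A / (A + B + C) * 100, where A is the area wet in both models and B and C are the over- and underpredicted areas. That formula is an assumption, chosen to be consistent with the 0-100 % description.

```python
# Sketch of a wet/dry measure of fit between a test model (SRTM)
# and a reference model (LiDAR). The overlap score below is a
# standard choice, assumed here; the abstract does not state
# the exact definition.
import numpy as np

def measure_of_fit(test_wet: np.ndarray, ref_wet: np.ndarray) -> float:
    """Percentage fit between two boolean inundation maps."""
    a = np.sum(test_wet & ref_wet)    # wet in both models
    b = np.sum(test_wet & ~ref_wet)   # overprediction by test model
    c = np.sum(~test_wet & ref_wet)   # underprediction by test model
    return 100.0 * a / (a + b + c)

# Toy example: 4 cells flooded in both, 1 over- and 2 under-predicted
ref  = np.array([[1, 1, 1], [1, 1, 1], [0, 0, 0]], dtype=bool)
test = np.array([[1, 1, 0], [1, 1, 0], [1, 0, 0]], dtype=bool)
print(f"fit = {measure_of_fit(test, ref):.1f} %")  # 4/(4+1+2) = 57.1 %
```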
