81

Spectral And Temporal Zero-Crossings-Based Signal Analysis

Shenoy, Ravi R 01 1900 (has links) (PDF)
We consider real zero-crossing analysis of the real/imaginary parts of the spectrum, namely, spectral zero-crossings (SZCs). The two major contributions are to show that: (i) SZCs enable temporal localization of transients; and (ii) SZCs are suitable for modelling transient signals. We develop a spectral dual of Kedem’s result linking the temporal zero-crossing rate (ZCR) to the spectral centroid. The key requirement is stationarity, which we achieve through random-phase modulation of the time-domain signal. Transient signals are not amenable to modelling in the time domain since they are bursts of energy localized in time and lack structure. We show that the spectrum of a transient signal has a rich modulation structure, which leads to an amplitude-modulation and frequency-modulation (AM-FM) model of the spectrum. We generalize Kedem’s arc-cosine formula to lags greater than one. For the specific case of a sinusoid in white Gaussian noise, He and Kedem devised an iterative filtering algorithm that leads to a contraction mapping. A first-order autoregressive filter is employed, and the pole location is the parameter updated based on the filtered output. We use the higher-order property relating the lagged ZCR of the filtered process to the higher-lag autocorrelation to develop an iterative higher-order autoregressive-filtering scheme, which stabilizes the ZCR and consequently provides robust estimates of the autocorrelation at higher lags. Next, we investigate ZC properties of the critically sampled outputs of a maximally decimated M-channel power-complementary analysis filterbank (PCAF) and express the ZCR of the input Gaussian process, at lags that are integer multiples of M, in terms of the subband ZCRs. Based on this result, we propose a robust autocorrelation estimator for a signal consisting of a sum of sinusoids of fixed amplitudes and uniformly distributed random phases. Robust subband ZCRs are obtained through iterative filtering, and the subband variances are estimated using the method-of-moments estimator. We compare the performance of the proposed estimator with the sample autocorrelation estimate in terms of bias, variance, and mean-squared error, and show through simulations that the proposed estimator outperforms the sample autocorrelation for medium to low SNR. We then consider the ZC statistics of the real/imaginary parts of the discrete Fourier spectrum. We introduce the notion of the spectral zero-crossing rate (SZCR) and show that, for transients, it gives information regarding the location of the transient. We also demonstrate the utility of the SZCR for estimating the interaural time delay between the left and right head-related impulse responses. The accuracy of the interaural time delay plays a vital role in binaural synthesis, and a comparison of the performance of the SZCR estimates with that of cross-correlation estimates illustrates that spectral zeros alone contain enough information to accurately estimate the interaural time delay. We provide a mathematical formalism for establishing the dual of the link between the zero-crossing rate and the spectral centroid. Specifically, we show that the expected SZCR of a stationary spectrum is a temporal centroid. For a deterministic sequence, we obtain a stationary spectrum by modulating the sequence with a random-phase, unit-amplitude sequence and then computing the spectrum.
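For orientation, the classical arc-cosine formula that this abstract generalizes admits a compact statement. The lag-k form below is the standard result for a zero-mean stationary Gaussian sequence (the lag-one case is Kedem's cosine formula); it is stated here for reference, not taken from the thesis:

```latex
% Expected lag-k zero-crossing rate of a zero-mean stationary Gaussian
% sequence with normalized autocorrelation \rho_k:
\gamma_k \;=\; \Pr\{x_n x_{n+k} < 0\} \;=\; \frac{1}{\pi}\cos^{-1}(\rho_k),
\qquad \text{equivalently} \qquad
\rho_k \;=\; \cos(\pi\,\gamma_k).
```

Inverting this relation at several lags is what allows stabilized ZCR estimates to stand in for higher-lag autocorrelations in the iterative filtering scheme described above.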
The notion of a stationary spectrum is necessary for deriving counterparts of the results available in the temporal zero-crossings literature. The robustness of the location information embedded in the SZCR is analyzed in the presence of a second transient within the observation window, and also in the presence of additive white Gaussian noise. A spectral-domain iterative filtering scheme based on autoregressive filters is presented, and improvement in the robustness of the location estimates is demonstrated. As an application, we consider epoch estimation in voiced speech signals and show that the location information is estimated more accurately using spectral zeros than with other techniques. The relationship between the temporal centroid and the SZCR also finds application in frequency-domain linear prediction (FDLP), which is used in audio compression. The prediction coefficients are estimated by solving the Yule-Walker equations constructed from the spectral autocorrelation. We use the relationship between the spectral autocorrelation and the temporal centroid to obtain the spectral autocorrelation directly by time-domain windowing, without explicitly computing the spectrum. The proposed method yields results identical to those of the standard FDLP method, but with reduced computational load. We then develop an SZC-based spectral-envelope and group-delay (SE-GD) model, which finds application in the modelling of non-stationary signals such as castanet sounds. Taking into account the modulation structure and spectral continuity, local polynomial regression is performed to estimate the GD from the real spectral zeros. The SE is estimated based on the phase function computed from the estimated GD. Since the GD estimate is parametric, the degree of smoothness can be controlled directly. Simulation results based on synthetic transient signals are presented to analyze the noise robustness of the SE-GD model. Applications to castanet modelling, transient compression, and estimation of glottal closure instants in speech are shown.
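As a quick illustration of the localization property described above, here is a minimal sketch (not from the thesis) in which the zero-crossing count of the real DFT spectrum recovers the location of an idealized impulsive transient; the signal length and impulse position are arbitrary choices:

```python
import numpy as np

N = 512
t0 = 137                       # assumed transient location, for illustration
x = np.zeros(N)
x[t0] = 1.0                    # idealized transient: a unit impulse

re = np.real(np.fft.fft(x))    # real part of the DFT spectrum
szc = np.count_nonzero(re[:-1] * re[1:] < 0)   # sign changes across bins

# For an impulse at t0, Re{X[k]} = cos(2*pi*k*t0/N), which crosses zero
# roughly 2*t0 times over the N bins, so the crossing count localizes it.
print(szc / 2.0)               # ~ 137
```

Real transients are short bursts rather than single impulses, which is why the thesis works with an AM-FM model of the spectrum instead of this idealized case.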
82

Urbanisation, Land Use and Soil Resource: Spatio-Temporal Analyses of Trends and Environmental Effects in Two Metropolitan Regions of Ghana (West Africa)

Asabere, Stephen Boahen 19 June 2020 (has links)
No description available.
83

Statistical models and stochastic algorithms for the analysis of longitudinal Riemannian manifold-valued data with multiple dynamics / Modèles statistiques et algorithmes stochastiques pour l’analyse de données longitudinales à dynamiques multiples et à valeurs sur des variétés riemanniennes

Chevallier, Juliette 26 September 2019 (has links)
Beyond transversal studies, the temporal evolution of phenomena is a field of growing interest. For the purpose of understanding a phenomenon, it appears more suitable to compare the evolution of its markers over time than to compare them at a given stage. The follow-up of neurodegenerative disorders, for instance, is carried out by monitoring cognitive scores over time. The same applies to chemotherapy monitoring: rather than by tumor appearance or volume alone, oncologists assess that a given treatment is effective from the moment it results in a decrease of tumor volume. The study of longitudinal data is not restricted to medical applications and proves successful in various fields of application such as computer vision, automatic detection of facial emotions, social sciences, etc. Mixed-effects models have proved their efficiency in the study of longitudinal data sets, especially for medical purposes. Recent works (Schiratti et al., 2015, 2017) allowed the study of complex data, such as anatomical data. The underlying idea is to model the temporal progression of a given phenomenon by continuous trajectories in a space of measurements, which is assumed to be a Riemannian manifold. Both a group-representative trajectory and the inter-individual variability are then estimated. However, these works assume a unidirectional dynamic and fail to encompass situations like multiple sclerosis or chemotherapy monitoring. Indeed, such diseases follow a chronic course, with phases of worsening, stabilization, and improvement, inducing changes in the global dynamic. The thesis is devoted to the development of methodological tools and algorithms suited to the analysis of longitudinal data arising from phenomena with multiple dynamics, and to their application to chemotherapy monitoring. We propose a nonlinear mixed-effects model in which a representative piecewise-geodesic trajectory of the global progression is estimated together with the spatial and temporal inter-individual variability. Particular attention is paid to the estimation of the correlation between the different phases of the evolution. This model provides a generic and coherent framework for studying longitudinal manifold-valued data. Estimation is formulated as a well-defined maximum a posteriori problem, which we prove to be consistent under mild assumptions. Numerically, due to the non-linearity of the proposed model, parameter estimation is performed through a stochastic version of the EM algorithm, namely the Markov chain Monte Carlo stochastic approximation EM (MCMC-SAEM). The convergence of the SAEM algorithm toward local maxima of the observed likelihood has been proved, and its numerical efficiency has been demonstrated. However, despite appealing features, the limit position of this algorithm can strongly depend on its starting position. To cope with this issue, we propose a new version of the SAEM in which we do not sample from the exact conditional distribution in the simulation phase of the procedure, and we prove its convergence toward local maxima of the observed likelihood. Finally, drawing on simulated annealing, we propose an instantiation of this general procedure that favors convergence toward global maxima: the tempering-SAEM.
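To make the algorithmic ingredients concrete, the following is a heavily simplified, hypothetical MCMC-SAEM loop with a tempering schedule, written for a toy Gaussian latent-variable model rather than the thesis's Riemannian mixed-effects model; the model, step sizes, and temperature schedule are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the mixed-effects setting:
#   z_i ~ N(mu, omega^2)  (individual effect),  y_ij ~ N(z_i, sigma^2).
n, m = 100, 5
z_true = rng.normal(2.0, 1.0, n)
y = rng.normal(z_true[:, None], 0.5, (n, m))

mu, omega2, sigma2 = 0.0, 1.0, 1.0       # crude initial guesses
z = y.mean(axis=1).copy()                # current latent states
s_z = s_zz = s_res = 0.0                 # stochastic approx. of suff. stats

for k in range(400):
    T = 1.0 + 4.0 * max(0.0, 1.0 - k / 100.0)   # temperature: 5 -> 1
    # Simulation step: one tempered Metropolis-Hastings sweep on z,
    # i.e. sampling from an approximation of the true conditional law.
    prop = z + rng.normal(0.0, 0.3, n)
    def logpi(zz):
        return (-0.5 * ((y - zz[:, None]) ** 2).sum(axis=1) / sigma2
                - 0.5 * (zz - mu) ** 2 / omega2)
    acc = np.log(rng.uniform(size=n)) < (logpi(prop) - logpi(z)) / T
    z[acc] = prop[acc]
    # Stochastic approximation step: gamma_k = 1, then decreasing.
    g = 1.0 if k < 100 else (k - 99) ** -0.7
    s_z += g * (z.mean() - s_z)
    s_zz += g * ((z ** 2).mean() - s_zz)
    s_res += g * (((y - z[:, None]) ** 2).mean() - s_res)
    # Maximization step: closed-form update from the approximated stats.
    mu = s_z
    omega2 = max(s_zz - s_z ** 2, 1e-6)
    sigma2 = max(s_res, 1e-6)

print(mu, omega2, sigma2)   # should approach roughly 2.0, 1.0, 0.25
```

In the thesis's setting the latents live on a manifold and the M-step has no closed form, but the simulate / approximate / maximize structure and the idea of heating the simulation step early on are the same.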
84

Spatiotemporal analysis of criteria air pollutants and volatile organic compounds from a moving vehicle

Davidson, Jon 31 August 2021 (has links)
This thesis describes the on-road analysis of criteria air pollutants (CAPs) and volatile organic compounds (VOCs) from a moving vehicle. CAPs and VOCs have numerous direct and indirect effects on the environment and public health and are generated by a variety of point and diffuse sources. The concentrations of these pollutants can vary on the scale of metres and seconds owing to variable source emission rates, meteorology, and the topography of an area. CAPs are conventionally measured on a spatial scale of tens of kilometres and at one-hour or longer time resolution, which limits the understanding of their impact and leaves many communities without information about their air quality. VOCs are not measured as frequently as CAPs, owing to the challenges and cost associated with sampling. The Mobile Mass Spectrometry Lab (MMSL) was developed to collect high geospatial (15 – 1,500 m) and temporal (1 – 10 s) resolution measurements of CAPs (O3, NOx, PM2.5), CO2, CH4, and VOCs. CAPs and greenhouse gases were monitored using standard analyzers, while VOCs were measured using a proton-transfer-reaction time-of-flight mass spectrometer (PTR-MS). PTR-MS is a real-time, direct, in situ technique that can monitor VOCs in the ambient atmosphere without sample collection. The PTR-MS monitored up to mass-to-charge 330 with a sample integration time of 1 or 10 seconds and had detection limits in the low- to mid-ppt range. PTR-MS is a soft-ionization technique that is selective for compounds with a proton affinity greater than that of water, which excludes the atmospheric matrix and includes most VOCs. The measurements provided by the PTR-MS yielded a rich dataset with which to develop workflow and processing methods, alongside sampling strategies, for the collection of high geospatial and temporal resolution VOC data. The first on-road deployment of the MMSL was performed across the Regional District of Nanaimo and the Alberni-Clayoquot Regional District in British Columbia, Canada, from July 2018 – April 2019 to monitor the geospatial and temporal variation in the concentrations of CAPs and VOCs. VOCs detected in the areas include hydrocarbons like toluene, C2-benzenes, and terpenes, organic acids like acetic acid, oxygenated compounds like acetone and acetaldehyde, and reduced sulfur compounds like methanethiol and dimethyl sulfide. While observed VOC concentrations were mostly below detection limits, concentration excursions, upwards of 2,200 ppb for C2-benzenes (reported as ethylbenzene) for instance, were observed across the various communities and industries that comprise central Vancouver Island. VOCs like monoterpenes were observed near the wood industries at up to 229 ppb. Combustion-related VOCs, like toluene and C2-benzenes, were often observed on major transportation corridors and were found to vary significantly between seasons, with winter measurements often exceeding those made in the summer. Reduced sulfur compounds, common components of nuisance odours, were measured around a few industries such as waste management and wood industries. The second on-road deployment of the MMSL focused on the analysis of VOCs in the community around a wastewater treatment plant (WWTP) to identify the source of odours in the area. In a unique study, VOCs were also monitored in the odour control process of the WWTP to identify which VOCs were being emitted, how much was emitted, and where potential deficiencies lay in the process.
Median emission rates at the facility for methanethiol, dimethyl sulfide, and dimethyl disulfide were determined to be 100, 19, and 21 kg yr-1, respectively. VOC monitoring in the community encompassed the WWTP and the other major industries in the area, including agricultural land, a composting facility, and a marina. The highest measurements of odorous reduced sulfur compounds were observed around the WWTP, upwards of 36 ppb for methanethiol. Unsupervised multivariate analysis was performed to identify groups of VOCs present and their potential sources. Three groups were identified, one of which was related to reduced sulfur compounds. This group was observed around the WWTP, indicating that the WWTP was the likely source of malodours in the community. / Graduate
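As an illustration of the kind of unsupervised grouping described above (the thesis does not name the algorithm used), a sketch with non-negative matrix factorization on synthetic PTR-MS-like data might look as follows; all dimensions, counts, and names are invented:

```python
import numpy as np
from sklearn.decomposition import NMF

# Factor a (samples x m/z channels) concentration matrix into a few
# non-negative source profiles, one common choice for this kind of task.
rng = np.random.default_rng(1)
true_profiles = np.abs(rng.normal(size=(3, 40)))   # 3 assumed source profiles
activity = np.abs(rng.normal(size=(200, 3)))       # source strength per sample
X = activity @ true_profiles + 0.01 * rng.random((200, 40))  # synthetic data

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)   # per-sample contribution of each VOC group
H = model.components_        # m/z signature of each group
print(W.shape, H.shape)      # (200, 3) and (3, 40)
```

Mapping the rows of W back onto the vehicle's GPS track is what would tie a group (e.g., reduced sulfur compounds) to a candidate source such as the WWTP.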
85

Dynamická analýza konstrukce zatížené seismickým zatížením / Dynamic analysis of a structure under seismic loading

Havlíková, Ivana January 2012 (has links)
The purpose of my master’s thesis is the analysis of a steel hall with concrete columns loaded by an earthquake. The simulation was carried out in the RFEM program. Both spectral (response-spectrum) and temporal (time-history) analyses were used, applied to models of the structure with several combinations of materials. The analysis was performed both for a general direction of the earthquake and for combinations of directions according to the standard procedures of EC8.
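For orientation, the EC8 directional-combination procedures the abstract refers to (EN 1998-1, clause 4.3.3.5) reduce to a small calculation; the action-effect values below are invented for illustration:

```python
import math

# Two standard EC8 ways to combine horizontal seismic action effects:
# SRSS of the two directions, or the "100% + 30%" rule.
E_x, E_y = 120.0, 80.0          # assumed effects from X- and Y-excitation

srss = math.hypot(E_x, E_y)                        # sqrt(Ex^2 + Ey^2)
rule30 = max(E_x + 0.30 * E_y, 0.30 * E_x + E_y)   # governing 100%/30% combo
print(round(srss, 1), round(rule30, 1))            # 144.2 and 144.0
```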
86

Marco Polo's Travels Revisited: From Motion Event Detection to Optimal Path Computation in 3D Maps

Niekler, Andreas, Wolska, Magdalena, Wiegmann, Matti, Stein, Benno, Burghardt, Manuel, Thiel, Marvin 11 July 2024 (has links)
In this work, we present a workflow for the semi-automatic extraction of geo-references and motion events from the book 'The Travels of Marco Polo'. These are then used to create 3D renderings of the space and movement, allowing readers to visually trace Marco Polo's route themselves and experience the journey in its entirety.
87

An integrated GIS-based and spatiotemporal analysis of traffic accidents: a case study in Sherbrooke

Harirforoush, Homayoun January 2017 (has links)
Abstract: Road traffic accidents claim more than 1,500 lives each year in Canada and affect society adversely, so transport authorities must reduce their impact. This is a major concern in Quebec, where traffic-accident risk increases year by year in proportion to provincial population growth. In reality, traffic crashes rarely occur randomly in space and time; they tend to cluster in specific areas such as intersections, ramps, and work zones. Moreover, weather stands out as an environmental risk factor that affects the crash rate. Therefore, traffic-safety engineers need to accurately identify the location and time of traffic accidents. The occurrence of such accidents is determined by several important factors, including traffic volume, weather conditions, and geometric design. This study aimed to identify hotspot locations based on a historical crash data set and on spatiotemporal patterns of traffic accidents, with a view to improving road safety. This thesis proposes two new methods for identifying hotspot locations on a road network. The first method can be used to identify and rank hotspot locations in cases where traffic-volume data are available, while the second is useful where they are not. These methods were examined with three years of traffic-accident data (2011–2013) in Sherbrooke. The first method proposes a two-step integrated approach for identifying traffic-accident hotspots on a road network. The first step involves a spatial-analysis method called network kernel-density estimation; the second, a network-screening method using the critical crash rate, as described in the Highway Safety Manual. Once the traffic-accident density has been estimated using network kernel-density estimation, the selected potential hotspot locations are tested with the critical-crash-rate method. The second method offers an integrated approach to analyzing spatial and temporal (spatiotemporal) patterns of traffic accidents and organizes them according to their level of significance. The spatiotemporal seasonal patterns of traffic accidents were analyzed using kernel-density estimation; the resulting densities were then used as the attribute for a significance test based on the local Moran's I index. The results of the first method demonstrated that over 90% of hotspot locations in Sherbrooke were located at intersections and in the downtown area, where conflicts between road users are significant. They also showed that signalized intersections were more dangerous than unsignalized ones; over half (58%) of the hotspot locations were at four-leg signalized intersections. The results of the second method show that crash patterns varied by season and during certain time periods. Overall, seasonal patterns were denser during the summer, fall, and winter, and steadier during the spring. Our findings also illustrate that crash patterns weighted by accident severity were denser than those based only on observed crash counts. The results clearly show that the proposed methods could assist transport authorities in quickly identifying the most hazardous sites in a road network, prioritizing hotspot locations in decreasing order more efficiently, and assessing the relationship between traffic accidents and seasons.
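The critical-crash-rate screen from the Highway Safety Manual, used in the second step of the first method, can be sketched in a few lines; the traffic volumes and crash counts below are invented, and z = 1.645 corresponds to a 95% confidence level:

```python
import numpy as np

aadt = np.array([12000, 8000, 25000, 5000])   # entering vehicles/day per site
crashes = np.array([14, 3, 30, 2])            # crashes over the study period
years, z = 3, 1.645

mev = aadt * 365 * years / 1e6       # exposure in million entering vehicles
rate = crashes / mev                 # observed crash rate per site
r_avg = crashes.sum() / mev.sum()    # systemwide weighted-average rate

# Critical rate per site: Rc = Ra + z*sqrt(Ra / MEV) + 1/(2*MEV);
# a site whose observed rate exceeds Rc is flagged as a hotspot.
r_crit = r_avg + z * np.sqrt(r_avg / mev) + 1.0 / (2.0 * mev)
print(rate.round(2), r_crit.round(2), rate > r_crit)
```

In the thesis, this test is applied only to the candidate sites that the network kernel-density step has already flagged as high-density.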
88

Fine grained sediment clean-up in a modern urban environment

Villemure, Marlene January 2013 (has links)
Fine-grained sediment deposition in urban environments during natural hazard events can impact critical infrastructure and properties (urban terrain), leading to reduced social and economic function and potentially adverse public health effects. Therefore, clean-up of the sediment is required to minimise impacts and restore social and economic functionality as soon as possible. The strategies employed to manage and coordinate the clean-up significantly influence the speed, cost, and quality of the clean-up operation. Additionally, the physical properties of the fine-grained sediment affect its clean-up, transport, storage, and future usage. The goals of the research are to assess the resources, time, and cost required for fine-grained sediment clean-up in an urban environment following a disaster, and to determine how the geotechnical properties of the sediment affect urban clean-up strategies. The thesis focuses on the impact of fine-grained sediment (<1 mm) deposition from three liquefaction events during the Canterbury earthquake sequence (2010-2011) on residential suburbs and transport networks in Christchurch. It also examines how the geotechnical properties of the material may affect clean-up strategies and methods, presenting geotechnical analysis of tephra material from the North Island of New Zealand. Finally, lessons for disaster response planning and decision making for sediment clean-up in urban environments are presented. A series of semi-structured interviews with key stakeholders, supported by relevant academic literature and media reports, was used to record the coordination and management of the clean-up operation and to make a preliminary characterisation of the Christchurch liquefaction ejecta clean-up (cost breakdown, time, volume, resources, coordination, planning, and priorities). A more accurate analysis of the costs and resources involved required examination of the Christchurch City Council road management database (RAMM). To move from general fine-sediment clean-up to the clean-up of specific types of fine disaster sediment, adequate information about the material properties is required, as these define how the material is handled, transported, and stored. Laboratory analysis of young volcanic tephra from New Zealand's North Island was performed to identify its geotechnical properties (density, granulometry, plasticity, composition, and angle of repose). The major findings of this research were that emergency planning and the use of the Coordinated Incident Management System (CIMS) during the emergency were important in facilitating rapid clean-up tasking, management of resources and, ultimately, recovery from the widespread and voluminous liquefaction ejecta deposition in eastern Christchurch. A total cost of approximately $NZ 40 million was estimated for the Christchurch City clean-up following the 2010-2011 Canterbury earthquake sequence, with a partial cost of $NZ 12 million for the southern part of the city, where up to 33% (418 km) of the road network was impacted by liquefaction ejecta and required clearing of the material following the 22 February 2011 earthquake. Over 500,000 tonnes of ejecta have been stockpiled at Burwood landfill across the three liquefaction-inducing earthquake events. The average clean-up cost per kilometre was $NZ 5,500/km (4 September 2010), $NZ 11,650/km (22 February 2011), and $NZ 11,185/km (13 June 2011).
The duration of the clean-up of residential properties and the road network was approximately two to three months for each of the three liquefaction ejecta events, despite differences in event volumes and in the spatial distribution of ejecta. Interviews and quantitative analysis of the RAMM data revealed that the experience and knowledge gained from the Darfield earthquake (4 September 2010) clean-up increased the efficiency of the subsequent earthquake-induced liquefaction ejecta clean-up events in Christchurch. Density, particle size, particle shape, clay content, and moisture content are the important geotechnical properties that need to be considered when planning a clean-up method that incorporates collection, transport, and disposal or storage. The geotechnical properties of the tephra samples were analysed to improve the preparedness and response of potentially affected North Island cities to possible products of the active volcanoes in their region. The geotechnical results from this study show that volcanic tephra could be used as road or construction material, but its properties would have to be further investigated for a New Zealand context. Using fresh volcanic material in road, building, or flood-control construction requires a good understanding of the material properties and extra care during design and construction, but if well planned, it can be economically beneficial.
