221 |
Relative motion history of the Pacific-Nazca (Farallon) plates since 30 million years ago [electronic resource] / by Douglas T. Wilder. January 2003
Title from PDF of title page. / Document formatted into pages; contains 105 pages. / Thesis (M.S.)--University of South Florida, 2003. / Includes bibliographical references. / Text (Electronic thesis) in PDF format. / ABSTRACT: Relative plate motion history since 30 Ma between the Pacific and the southern portion of the Nazca (Farallon) plates is examined. The history is constrained by available seafloor magnetic anomaly data and a two-minute grid of predicted bathymetry derived from satellite altimetry and shipboard sensors. These data are used to create a new plate motion reconstruction based on new magnetic anomaly identifications and finite poles of motion. The newly identified magnetic isochrons and the tectonic reconstruction provide greater resolution of the tectonic history between chrons 7y (24.73 Ma) and 3 (4.18 Ma) than previous interpretations. Shipboard magnetics and aeromagnetic data from over 250 expeditions were plotted and used to extrapolate magnetic anomalies picked from 2D magnetic modeling of selected cruises. Magnetic anomalies were further constrained by tectonic features evident in the predicted bathymetry. Previously published magnetic anomaly locations consistent with this work were used where the interpretation could not be constrained by 2D modeling and map extrapolation. Point locations for anomalies were used as input for the calculation of finite poles of motion for chrons 10y, 7y, 6c, 5d, 5b, 5aa, 5o, 4a and 3a. An iterative process of anomaly mapping, pole calculation and anomaly point rotation was used to refine the finite poles of motion. Eleven stage poles were calculated from the nine finite poles from this study and two published instantaneous Euler vectors. Tectonic reconstructions indicate a history dominated by two major southward ridge propagation events, the first starting by 28 Ma and completed by 18 Ma. The second event initiated in association with the breakup of the Farallon plate around 24 Ma and ceased by about 11 Ma. Lithosphere was transferred from the Nazca to the Pacific plate during the first event and in the opposite sense during the second. Development of the Mendoza microplate east of the later propagator occurred at about 20 Ma, and this dual spreading process appears to have lasted until about 15 Ma. / System requirements: World Wide Web browser and PDF reader. / Mode of access: World Wide Web.
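The reconstruction machinery described above ultimately reduces to rotating anomaly point locations on a sphere about finite poles of motion. As a minimal illustrative sketch (not the author's code; the pole and angle below are made up), Rodrigues' rotation formula applied to a geographic point about a hypothetical Euler pole:

```python
import math

def to_cartesian(lat_deg, lon_deg):
    """Convert geographic coordinates (degrees) to a unit vector."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def to_geographic(v):
    """Convert a unit vector back to (latitude, longitude) in degrees."""
    x, y, z = v
    z = max(-1.0, min(1.0, z))  # guard asin against float round-off
    return (math.degrees(math.asin(z)), math.degrees(math.atan2(y, x)))

def rotate(point, pole, angle_deg):
    """Rodrigues rotation of `point` about the Euler `pole` by angle_deg:
    v' = v cos(a) + (k x v) sin(a) + k (k.v)(1 - cos(a))."""
    p, k = to_cartesian(*point), to_cartesian(*pole)
    a = math.radians(angle_deg)
    dot = sum(pi * ki for pi, ki in zip(p, k))
    cross = (k[1] * p[2] - k[2] * p[1],
             k[2] * p[0] - k[0] * p[2],
             k[0] * p[1] - k[1] * p[0])
    v = tuple(pi * math.cos(a) + ci * math.sin(a) + ki * dot * (1 - math.cos(a))
              for pi, ci, ki in zip(p, cross, k))
    return to_geographic(v)

# A point on the equator rotated 90 degrees about the north pole
# shifts its longitude by 90 degrees.
print(rotate((0.0, 0.0), (90.0, 0.0), 90.0))
```

Refining a pole then amounts to iterating this rotation over anomaly picks and adjusting the pole until rotated isochrons from one plate overlie their conjugates.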
|
222 |
Optimized Correlation of Geophysical And Geotechnical Methods In Sinkhole Investigations: Emphasizing On Spatial Variations In West-Central Florida / Kiflu, Henok Gidey. 01 January 2013
Abstract
Sinkholes and sinkhole-related features in West-Central Florida (WCF) are commonly identified using geotechnical investigations such as standard penetration test (SPT) borings and geophysical methods such as ground penetrating radar (GPR) and electrical resistivity tomography (ERT). Geophysical investigation results can be used to locate drilling and field testing sites, while geotechnical investigations can be used to ground-truth geophysical results. The two methods yield complementary information. Geotechnical investigations give important information about soil type, groundwater level and the presence of low-density soils or voids at the test location, while geophysical investigations such as GPR surveys have better spatial coverage and can resolve shallow stratigraphic indicators of subsidence.
From GPR profiles collected at 103 residential sites in covered-karst terrain in WCF, sinkhole-related anomalies are identified using both GPR and SPT methods. We analyze the degree to which the shallow features imaged in GPR correlate spatially with the N-values (blow counts) derived from SPTs at the same sites. GPR anomalies indicating sinkhole activity are defined as zones where subsurface layers show local downwarping, discontinuities, or sudden increases in amplitude or penetration of the GPR signal. "Low SPT values" indicating sinkhole activity are defined using an optimization code that searches for the threshold SPT value showing optimum correlation between GPR and SPT over different depth ranges. We also compare these criteria with other commonly used geotechnical criteria such as weight-of-rod and weight-of-hammer conditions.
Geotechnical results are also used to filter the data based on site characteristics, such as the presence of shallow clay layers, to study the effectiveness of GPR in different zones. Subsets of the dataset are further analyzed based on geotechnical results such as clay thickness, bedrock depth and groundwater conditions, and on other geological factors such as geomorphology, lithology, engineering soil type, soil thickness and prevalent sinkhole type. Results are used to examine (1) which SPT indicators show the strongest correlations with GPR anomalies, (2) the degree to which GPR surveys improve the placement of SPT borings, and (3) what these results indicate about the structure of sinkholes at these sites.
For the entire data set, we find a statistically significant correlation between GPR anomalies and low SPT N-values at a confidence level of 90%. Logistic regression analysis shows that the strongest correlations are between GPR anomalies and SPT values measured in the depth range of 0-4.5 m. In the general study area, the probability of observing a GPR anomaly at a site decreases by up to 84% as the minimum SPT value increases from 0 to 20. Boreholes drilled on GPR anomalies are statistically significantly more likely to show zones of anomalously low SPT values than boreholes drilled off GPR anomalies. We also find that the optimum SPT criteria correlate better with GPR than other simple, commonly used geotechnical criteria such as weight-of-rod and weight-of-hammer conditions. Correlations improve when sites with poor GPR penetration are filtered out of the dataset. The odds ratio shows a similar pattern, although it varies with the depth range, the statistic used and the threshold SPT value (the low N-value giving optimum correlation), with a maximum observed odds ratio of 3.
Several statistical results suggest that raveling zones connecting voids to the surface may be inclined, so that shallow GPR anomalies are laterally offset from deeper zones of low N-values. Compared to the general study area, we find locally stronger correlations in some sub-regions; for example, the odds ratio for the Tertiary Hawthorn subgroup is 25 times higher than that for the general study area (WCF).
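The threshold search described above, picking the SPT cutoff whose 2x2 contingency table against GPR anomalies gives the strongest association, can be sketched with an odds-ratio sweep. The borings and candidate thresholds below are invented for illustration; this is not the study's optimization code:

```python
def odds_ratio(table):
    """Odds ratio of a 2x2 contingency table:
    [[on-anomaly & low-N,  on-anomaly & normal-N],
     [off-anomaly & low-N, off-anomaly & normal-N]]."""
    (a, b), (c, d) = table
    return (a * d) / (b * c)

def best_threshold(borings, thresholds):
    """Sweep candidate SPT thresholds and keep the one with the strongest
    GPR/SPT association (largest odds ratio). `borings` is a list of
    (minimum N-value in the boring, drilled on a GPR anomaly?) pairs."""
    best = None
    for t in thresholds:
        a = sum(1 for n, g in borings if g and n <= t)
        b = sum(1 for n, g in borings if g and n > t)
        c = sum(1 for n, g in borings if not g and n <= t)
        d = sum(1 for n, g in borings if not g and n > t)
        if 0 in (b, c):          # degenerate table, odds ratio undefined
            continue
        orr = odds_ratio(((a, b), (c, d)))
        if best is None or orr > best[1]:
            best = (t, orr)
    return best

# Toy data in which low N-values co-occur with GPR anomalies.
data = [(2, True), (4, True), (6, True), (18, True),
        (15, False), (22, False), (9, False), (30, False)]
print(best_threshold(data, range(0, 31, 5)))
```

The same table feeds a logistic regression in the study; the sweep above only illustrates how an "optimum" low-N threshold can be selected by association strength.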
|
223 |
Short and Long Term Volcano Instability Studies at Concepción Volcano, Nicaragua / Saballos, Jose Armando. 01 January 2013
Concepción is the most active composite volcano in Nicaragua, and is located on Ometepe Island, within Lake Nicaragua. Moderate to small volcanic explosions with a volcanic explosivity index (VEI) of 1-2 have been characteristic of this volcano during the last four decades. Although its current activity is not violent, its volcanic deposits reveal stages of violent activity involving Plinian and sub-Plinian eruptions that deposited vast amounts of volcanic tephra in the Atlantic Ocean. These observations, together with the 31,000 people living on the island, make Concepción volcano an important target for volcanological research.
My research focuses on the investigation of the stability of the volcano edifice of Concepción, using geophysical data such as gravity, geodetic global positioning system (GPS), sulphur dioxide (SO2) flux, real-time seismic amplitude (RSAM), and satellite remotely-sensed data. The integration of these data sets provides information about the short-term behavior of Concepción, and some insights into the volcano's long-term behavior.
This study has provided, for the first time, information about the shallow dynamics of Concepción on time scales of days to weeks. I furnish evidence that this volcano is not gravitationally spreading in a continuous fashion as previously thought, that its bulk average density is comparable to that of a pile of gravel, that the volcano edifice is composed of two major distinctive lithologies, that the deformation field around the volcano is recoverable in a matter of days, and that the deformation source is located in the shallow crust. This source is also degassing through the relatively open magmatic conduit. There are, however, several remaining questions. Although the volcano is not spreading continuously, there is the possibility that gravitational spreading may be taking place in a stick-slip fashion. This has important implications for the slope stability of the volcano and the associated hazards. The factors influencing the long-term slope stability of the volcano are still not fully resolved, but internal volcanic processes and anthropogenic disturbances appear to be the major factors.
|
224 |
Two Essays on Stock Repurchases - The Post Repurchase Announcement Drift: An Anomaly in Disguise? and Intra Industry Effects of IPOs on Stock Repurchase Decisions / Nguyen, Thanh Thiet. 01 January 2013
We reexamine the stock price drifts following open-market stock repurchase announcements by differentiating actual repurchases from repurchase announcements and by controlling for the repurchasing firms' earnings improvement in the announcement year relative to the prior year. Our results show that only firms that actually repurchase their shares exhibit a positive post-announcement drift. More importantly, we find that these repurchasing firms have the same post-announcement drift as their matching firms that have similar size and earnings performance but do not repurchase. Further analysis indicates that the post-repurchase announcement drift is not a distinct anomaly but the well-documented post-earnings announcement drift in disguise. In addition, previous studies suggest that the market perceives IPOs as bad news (i.e., competitive threats) to existing firms in the same industry. At the same time, the market has a tendency to be overly optimistic about IPO prospects, especially during hot IPO markets. Thus, the negative industry rival reaction could be the result of investors' over-optimism toward the IPOs' growth prospects and underestimation of the competitive positions of industry rivals. Our findings show that rival firms use repurchases as a means to signal their firm quality, as well as to correct the market's overreaction to the bad news. These IPO-induced repurchases are stronger when the rival firms are in a concentrated industry and experienced poor stock performance in the previous year.
|
225 |
Coding-Based System Primitives for Airborne Cloud Computing / Lin, Chit-Kwan. January 2011
The recent proliferation of sensors in inhospitable environments such as disaster or battle zones has not been matched by in situ data processing capabilities due to a lack of computing infrastructure in the field. We envision a solution based on small, low-altitude unmanned aerial vehicles (UAVs) that can deploy elastically scalable computing infrastructure anywhere, at any time. This airborne compute cloud—essentially, micro-data centers hosted on UAVs—would communicate with terrestrial assets over a bandwidth-constrained wireless network with variable, unpredictable link qualities. Achieving high performance over this ground-to-air mobile radio channel thus requires making full and efficient use of every single transmission opportunity. To this end, this dissertation presents two system primitives that improve throughput and reduce network overhead by using recent distributed coding methods to exploit natural properties of the airborne environment (i.e., antenna beam diversity and anomaly sparsity). We first built and deployed a UAV wireless networking testbed and used it to characterize the ground-to-UAV wireless channel. Our flight experiments revealed that antenna beam diversity from using multiple SISO radios boosts reception range and aggregate throughput. This observation led us to develop our first primitive: ground-to-UAV bulk data transport. We designed and implemented FlowCode, a reliable link layer for uplink data transport that uses network coding to harness antenna beam diversity gains. Via flight experiments, we show that FlowCode can boost reception range and TCP throughput as much as 4.5-fold. Our second primitive permits low-overhead cloud status monitoring. We designed CloudSense, a network switch that compresses cloud status streams in-network via compressive sensing.
CloudSense is particularly useful for anomaly detection tasks requiring global relative comparisons (e.g., MapReduce straggler detection) and can achieve up to 16.3-fold compression as well as early detection of the worst anomalies. Our efforts have also shed light on the close relationship between network coding and compressive sensing. Thus, we offer FlowCode and CloudSense not only as first steps toward the airborne compute cloud, but also as exemplars of two classes of applications—approximation intolerant and tolerant—to which network coding and compressive sensing should be judiciously and selectively applied. / Engineering and Applied Sciences
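The network-coding idea behind a primitive like FlowCode can be illustrated at toy scale. The sketch below is not the FlowCode implementation, only a generic random linear network code over GF(2): each coded packet is the XOR of a random subset of source packets, tagged with its coefficient vector, and a receiver decodes by Gaussian elimination once the received vectors reach full rank — which is why any sufficiently large set of coded packets, from any mix of antenna beams, suffices.

```python
import random

def encode(packets, n_coded, rng):
    """Produce n_coded random GF(2) combinations of the source packets
    (packets are integers treated as bit vectors)."""
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in packets]
        if not any(coeffs):                      # avoid the useless all-zero row
            coeffs[rng.randrange(len(packets))] = 1
        payload = 0
        for c, p in zip(coeffs, packets):
            if c:
                payload ^= p
        coded.append((coeffs, payload))
    return coded

def decode(coded, k):
    """Gaussian elimination over GF(2); returns the k source packets once
    the coefficient vectors span the full space, else None."""
    rows = [(list(c), p) for c, p in coded]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None                          # not yet full rank
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           rows[i][1] ^ rows[col][1])
    return [rows[i][1] for i in range(k)]

rng = random.Random(1)
src = [0b1010, 0b0111, 0b1100]            # three source packets as bit masks
coded = encode(src, 6, rng)               # extra coded packets add redundancy
print(decode(coded, 3))
```

Real systems operate over larger fields (e.g., GF(2^8)) and batch packets into generations; the rank-based decoding logic is the same.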
|
226 |
Behavioral Finance : Kan ökad medvetenhet om marknadspsykologi förbättra kvalitén vid aktiemarknadsanalys och investeringsbeslut? / Behavioral Finance : Can increased awareness of market psychology improve the quality of stock market analysis and investment decisions? / Levinsson, Jimmy; Molin, Johan. January 2010
Financial education is characterized by classical finance theory, which assumes that the financial market prices assets rationally. There is, however, a gap between classical finance theory and reality. The aim has therefore been to see how an investor, through increased awareness of these anomalies, can improve stock market analysis and investment decisions. The study was conducted with a qualitative approach and based on a literature review supplemented by interviews. During the study, a picture of the investor as boundedly rational has emerged, in line with the theories of behavioral finance, with herd behavior standing out as the most tangible evidence that classical finance theory is not comparable to the dynamic reality of the market. In the study, the theories of behavioral finance have been thematised and explained in the light of the empirical material. Common to these theories is that investors tend to be boundedly rational. Psychology is ever present in the market and affects investors in their decision-making to a greater extent than classical finance theory allows, which is one of the main reasons why behavioral finance and its theories should become a complement to classical finance theory. The conclusion is that there is potential for investors to improve stock market analysis and investment decisions by taking the theories of behavioral finance into consideration.
|
227 |
Transmission Properties of Sub-Wavelength Metallic Slits and Their Applications / Xie, Yong. January 2006
With the fabrication of nano-scale features over the last ten years, it has become possible to perform optical experiments on features as small as a tenth, or even a hundredth, of a wavelength. It turns out that the experimental data cannot be explained by classical diffraction theories. Thus, it is necessary to develop new methods, or to adopt existing approaches that are effective in other fields, to solve problems in photonics. We use the finite difference time domain (FDTD) method to study the transmission properties of sub-wavelength slits in a metallic film. By simulating periodic and single slits, we confirm that the TE mode has a cutoff while the TM mode always propagates in the small apertures. We then find that the transmittance is at a minimum when the array period is equal to the wavelength of the surface plasmon polariton (SPP) at normal incidence. In fact, SPP-like waves exist in both periodic and isolated slits, and they enhance the transmittance of small apertures. In order to establish the role of the SPP in the transmission mechanism, it is necessary to single out each mode from the total fields. We developed the Bloch mode method (BMM) to calculate the amplitudes of the lowest N orders; the amplitudes tell us which mode is dominant (not including the guided mode) at high and low transmission. BMM converges very fast and is more accurate than FDTD since it does not suffer from numerical dispersion. Both methods can resolve the Wood anomaly and the SPP anomaly; however, FDTD converges very slowly at the SPP resonance and oscillates around the value obtained through BMM at the Wood anomaly. BMM is not sensitive to material types, incident angles, or anomalies; it will be a useful tool for investigating similar problems.
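The transmittance-minimum condition above, array period equal to the SPP wavelength, follows from the standard SPP dispersion relation k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)) for a metal/dielectric interface. A small sketch (the metal permittivity below is an assumed illustrative near-infrared value, not a quantity taken from the dissertation):

```python
import cmath

def spp_wavelength(lam0_nm, eps_metal, eps_dielectric=1.0):
    """SPP wavelength at a metal/dielectric interface.
    k_spp = k0 * sqrt(eps_m * eps_d / (eps_m + eps_d)), so
    lam_spp = lam0 / Re[sqrt(eps_m * eps_d / (eps_m + eps_d))]."""
    n_eff = cmath.sqrt(eps_metal * eps_dielectric /
                       (eps_metal + eps_dielectric))
    return lam0_nm / n_eff.real

# Illustrative: a noble-metal-like permittivity at 800 nm in air.
print(spp_wavelength(800.0, -24.0 + 1.5j))
```

Because |eps_m| is large compared to eps_d, the SPP wavelength sits just below the free-space wavelength, which is why an array with period near lam0 couples incident light into the SPP and depletes the directly transmitted beam.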
|
228 |
EV/EBITDA : är det supermultipeln som kan generera överavkastning? / EV/EBITDA : is it the super multiple which can generate excess return? / Karlsson, Sandra; Najafi, Anna-Maria. January 2011
Background: The efficient market hypothesis holds that an investor cannot systematically exploit deviations in the market to earn excess returns. Nevertheless, there are established investment strategies that investors use in an attempt to generate excess return. Aim: The aim of the study is to examine whether it is possible to earn excess returns by investing in companies that exhibit a low or a high EV/EBITDA multiple. The variables industry and risk are also examined, based on the possible existence of an investment strategy that generates excess return. Completion: Theory within the field builds a basic understanding of the problem; empirical data were then gathered from the companies to obtain EV/EBITDA multiples for the portfolios in the study, and stock prices were collected to form the empirical base. The results were compared with the OMXSPI index and with the risk-adjusted return of the portfolios. Result: The results show that an investment strategy of investing in low EV/EBITDA multiples can be used to earn excess return on the Swedish stock market. The effect is clear across the whole Stockholm stock exchange, as well as in two of the examined industries: consumer durables and manufacturing. The investor's risk preference does not affect the choice, since the portfolios with low multiples generate the highest return and also have the lowest beta values.
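The multiple itself and the portfolio formation can be sketched as follows. The firms and figures are hypothetical, and the study's actual portfolio rules (number of portfolios, rebalancing dates) are not reproduced here:

```python
def ev_ebitda(market_cap, net_debt, ebitda):
    """Enterprise value over EBITDA; EV = equity market value + net debt."""
    return (market_cap + net_debt) / ebitda

def form_portfolios(companies):
    """Rank companies on the multiple and split into a low-multiple and a
    high-multiple half, mirroring the low vs. high comparison in the study."""
    ranked = sorted(companies, key=lambda c: ev_ebitda(*c[1:]))
    half = len(ranked) // 2
    return [c[0] for c in ranked[:half]], [c[0] for c in ranked[half:]]

# Hypothetical firms: (name, market cap, net debt, EBITDA), all in MSEK.
firms = [("A", 1000, 200, 300),   # EV/EBITDA = 4.0
         ("B", 5000, 1000, 400),  # 15.0
         ("C", 800, -100, 100),   # 7.0
         ("D", 3000, 500, 250)]   # 14.0
low, high = form_portfolios(firms)
print(low, high)
```

Portfolio returns would then be tracked against the index and beta-adjusted, which is the comparison the study performs against OMXSPI.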
|
229 |
Identifikation av icke-representativa svar i frågeundersökningar genom detektion av multivariata avvikare / Identification of non-representative responses in surveys through detection of multivariate outliers / Galvenius, Hugo. January 2014
To United Minds, large-scale surveys are an important client offering, not least the public opinion poll Väljarbarometern. A risk associated with surveys is satisficing, sub-optimal response behaviour that impairs the possibility of correctly describing the sampled population through the results. The purpose of this study is to identify, through the use of multivariate outlier detection methods, those observations assumed to be non-representative of the population. The possibility of categorizing responses generated through satisficing as outliers is investigated. With regard to the character of the Väljarbarometern dataset, three existing algorithms are adapted to detect these outliers. In addition, a number of randomly generated observations are added to the data, all of which are correctly labelled as outliers by every algorithm. The resulting anomaly scores generated by each algorithm are compared, leading to the conclusion that the Otey algorithm is the most effective for the purpose, above all because it takes correlation between variables into account. A plausible cut-off value for outliers, and the separation between non-representative and representative outliers, are discussed. The resulting recommendation is to handle observations labelled as outliers through respondent follow-up or, where that is not possible, through downweighting inversely proportional to the anomaly scores.
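The recommended downweighting, inversely proportional to the anomaly score, might look like the following sketch. The cutoff and scores are hypothetical, and the exact weighting rule used in the thesis is not reproduced here:

```python
def downweight(anomaly_scores, cutoff):
    """Survey weights per respondent: observations whose anomaly score
    exceeds the cutoff get a weight inversely proportional to the score
    (cutoff / score), so the most anomalous responses count least;
    everyone else keeps full weight 1.0."""
    return [cutoff / s if s > cutoff else 1.0 for s in anomaly_scores]

# Hypothetical per-respondent anomaly scores from an outlier detector.
scores = [0.2, 0.5, 1.5, 3.0]
print(downweight(scores, 1.0))
```

A weight of cutoff/score is continuous at the cutoff (weight 1.0 exactly at the boundary), avoiding an abrupt drop for respondents just past the threshold.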
|
230 |
Kompiuterių tinklo srautų anomalijų aptikimo metodai / Detection of network traffic anomalies / Krakauskas, Vytautas. 03 June 2006
This paper describes various network monitoring technologies and anomaly detection methods. NetFlow was chosen for the anomaly detection system being developed. Anomalies are detected using a deviation value. After evaluating the quality of the developed system, new enhancements were suggested and implemented. Flow data distribution was proposed to achieve a more precise representation of NetFlow data, enabling more precise use of network monitoring information for anomaly detection. The arithmetic average calculation was replaced with the more flexible exponentially weighted moving average (EWMA) algorithm. A deviation weight was introduced to reduce false alarms. Results from an experiment with real-life data showed that the proposed changes increased the precision of the NetFlow-based anomaly detection system.
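The combination described above, an EWMA baseline plus a deviation value for flagging anomalous flow volumes, can be sketched as follows. This is a generic illustration, not the thesis's implementation; alpha, k, and the warm-up length are assumed parameters:

```python
def ewma_detector(series, alpha=0.3, k=3.0, warmup=4):
    """Flag indices where the deviation from an exponentially weighted
    moving average exceeds k times the EWMA of past deviations.
    A short warm-up suppresses alarms before the baseline stabilizes."""
    mean, dev = float(series[0]), 0.0
    alarms = []
    for i, x in enumerate(series[1:], start=1):
        err = abs(x - mean)
        if i >= warmup and err > k * dev:
            alarms.append(i)
        # Update the baseline and the deviation estimate.
        mean = alpha * x + (1 - alpha) * mean
        dev = alpha * err + (1 - alpha) * dev
    return alarms

# Hypothetical per-interval flow counts with a spike at index 6.
traffic = [100, 102, 98, 101, 99, 100, 400, 101]
print(ewma_detector(traffic))
```

Note that the spike also inflates the baseline for a few intervals afterwards; whether anomalous samples should be excluded from the EWMA update is a design choice this sketch leaves open.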
|