  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
261

Statistical Analysis of Structured High-dimensional Data

Sun, Yizhi 05 October 2018 (has links)
High-dimensional data such as multi-modal neuroimaging data and large-scale networks carry an enormous amount of information, and can be used to test various scientific hypotheses or discover important patterns in complicated systems. While considerable efforts have been made to analyze high-dimensional data, existing approaches often rely on simple summaries that can miss important information, and many challenges in modeling complex structures in data remain unaddressed. In this dissertation, we focus on analyzing structured high-dimensional data, including functional data with important local regions and network data with community structures. The first part of this dissertation concerns the detection of ``important'' regions in functional data. We propose a novel Bayesian approach that enables region selection in the functional data regression framework. The selection of regions is achieved by encouraging sparse estimation of the regression coefficient, where nonzero regions correspond to selected regions. To achieve sparse estimation, we adopt compactly supported and potentially over-complete bases to capture local features of the regression coefficient function, and place a spike-and-slab prior on the coefficients of the basis functions. To encourage continuous shrinkage of nearby regions, we assume an Ising hyper-prior that takes into account the neighboring structure of the basis functions, represented by an undirected graph. We perform posterior sampling through Markov chain Monte Carlo algorithms. The practical performance of the proposed approach is demonstrated through simulations as well as near-infrared and sonar data. The second part of this dissertation focuses on constructing diversified portfolios using stock return data from the Center for Research in Security Prices (CRSP) database maintained by the University of Chicago.
Diversification is a risk management strategy that involves mixing a variety of financial assets in a portfolio. This strategy helps reduce the overall risk of the investment and improve the performance of the portfolio. To construct portfolios that effectively diversify risks, we first construct a co-movement network using the correlations between stock returns over a training time period. Correlation characterizes the synchrony among stock returns and thus helps us understand whether two or more stocks share common risk attributes. Based on the co-movement network, we apply multiple network community detection algorithms to detect groups of stocks with common co-movement patterns. Stocks within the same community tend to be highly correlated, while stocks across different communities tend to be less correlated. A portfolio is then constructed by selecting stocks from different communities. The average return of the constructed portfolio over a testing time period is finally compared with the S&P 500 market index. Our constructed portfolios demonstrate outstanding performance during a non-crisis period (2004-2006) and good performance during a financial crisis period (2008-2010). / PHD / High-dimensional data, which consist of data points with a tremendous number of features (also known as attributes, independent variables, or explanatory variables), bring challenges to statistical analysis due to their “high-dimensionality” and complicated structure. In this dissertation work, I consider two types of high-dimensional data. The first type is functional data, in which each observation is a function. The second type is network data, whose internal structure can be described as a network. I aim to detect “important” regions in functional data by using a novel statistical model, and I treat stock market data as network data to construct quality portfolios efficiently.
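The community-based diversification strategy described above can be sketched in a few lines. The sketch below is a simplified stand-in for the dissertation's approach: it uses connected components of a thresholded correlation graph as "communities" (the actual work applies multiple dedicated community detection algorithms), then picks the stock with the highest mean training return from each group. The stock names and the correlation threshold are illustrative.

```python
from itertools import combinations

def pearson(x, y):
    """Pearson correlation of two equal-length return series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def communities(returns, threshold=0.7):
    """Group stocks into co-movement communities: connected components of the
    graph linking pairs whose return correlation exceeds the threshold."""
    parent = {s: s for s in returns}
    def find(s):                      # union-find with path compression
        while parent[s] != s:
            parent[s] = parent[parent[s]]
            s = parent[s]
        return s
    for a, b in combinations(returns, 2):
        if pearson(returns[a], returns[b]) > threshold:
            parent[find(a)] = find(b)
    groups = {}
    for s in returns:
        groups.setdefault(find(s), []).append(s)
    return list(groups.values())

def diversified_portfolio(returns, threshold=0.7):
    """Pick the stock with the highest mean training return from each community."""
    return sorted(max(g, key=lambda s: sum(returns[s]) / len(returns[s]))
                  for g in communities(returns, threshold))
```

For example, with returns {"A": [1, 2, 3, 4], "B": [2, 4, 6, 8], "C": [4, 3, 2, 1]}, stocks A and B co-move perfectly and form one community, C forms another, and the portfolio picks B (higher mean return) plus C.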
262

Acoustic emission monitoring of fiber reinforced bridge panels

Flannigan, James Christopher January 1900 (has links)
Master of Science / Department of Mechanical and Nuclear Engineering / Youqi Wang / Two fiber reinforced polymer (FRP) bridge deck specimens were analyzed by means of acoustic emission (AE) monitoring during a series of loading cycles performed at various locations on the composite sandwich panels' surfaces. These panels were subjected to loads intended to test their structural response and characteristics without exposing them to a failure scenario. This allowed the sensors to record multiple data sets without having to place them on multiple panels whose differing characteristics could alter the recorded signals. The objective throughout the analysis was to determine how the acoustic signals respond to loading cycles and how various events affect the acoustical data. In the process of performing this examination, several steps were taken, including threshold application, data collection, and sensor location analysis. The thresholds are important for reducing the size of the data files while retaining information that could prove structurally significant. Equally important is determining where and how the sensors should be placed on the panels in relation to other sensors, panel features, and supporting beams. The data were subjected to analysis involving the response to applied loads, joint effects, and failure analysis. Using previously developed techniques, the gathered information was also analyzed in terms of what type of failure could be occurring within the structure itself. This aided the analysis conducted after an unplanned failure event occurred, helping to determine what cause or causes might have led to the occurrence. The basic analyses were separated into four sets, starting with the basic analysis to determine correlations to the applied loads. This was followed by joint and sensor location analyses, both of which took place using a two-panel setup.
The last set was created upon matrix failure of the panel and the subsequent investigation.
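As an illustration of the threshold application step mentioned above, the sketch below filters raw AE samples down to discrete "hits" by amplitude thresholding, with a simple dead-time to suppress re-triggers on the ring-down of a hit. The threshold and dead-time values are hypothetical, not those used in the study:

```python
def extract_hits(samples, threshold, dead_time=3):
    """Return (index, amplitude) pairs for acoustic-emission hits: samples whose
    absolute amplitude reaches the threshold, suppressing re-triggers that fall
    within `dead_time` samples of the previous hit."""
    hits, last = [], None
    for i, s in enumerate(samples):
        if abs(s) >= threshold and (last is None or i - last >= dead_time):
            hits.append((i, s))
            last = i
    return hits
```

Keeping only hits, rather than the full waveform, is one way thresholds reduce file size while preserving the structurally significant events.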
263

The Development and the Evaluation of a Quasi-Real Time Decision Aid Tool

Leite, Nelson Paiva Oliveira, Lopes, Leonardo Mauricio de Faria, Walter, Fernando 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / In an experimental flight test campaign, the usage of a real-time Ground Telemetry System (GTS) provides mandatory support for three essential services: a) safe storage of Flight Test Instrumentation (FTI) data in the event of a critical aircraft failure; b) monitoring of critical flight safety parameters to avoid the occurrence of accidents; and c) monitoring of selected parameters that validate all test points. On the operational side, test ranges typically work in two phases: a) in real time, where the GTS crew performs test validation and test point selection with telemetry data; and b) in post-mission, where the engineering crew performs data analysis and reduction with airborne recorded data. This process is time consuming because recorded data has to be downloaded, converted to Engineering Units (EU), sliced, filtered, and processed. The main reason for the usage of this less efficient process lies in the fact that real-time telemetry data is less reliable than recorded data (i.e. it contains more noise and some dropouts). With the introduction of new technologies (i.e. i-NET), the telemetry link could become very reliable, so the GTS could perform data reduction analysis immediately after the receipt of all valid test points, while the aircraft is still flying, in a quasi-real-time environment. To achieve this goal the Brazilian Flight Test Group (GEEV), along with EMBRAER and with the support of Financiadora de Estudos e Projetos (FINEP), started the development of a series of Decision Aid Tools that perform data reduction analysis in the GTS in quasi-real time. This paper presents the development and the evaluation of a tool used in an Air Data System Calibration Flight Test Campaign.
The application receives the telemetry data over either TCP/IP or a SCRAMnet network, performs data analysis and test point validation in real time, and once all points are gathered, performs the data reduction analysis and automatically creates HTML-formatted test reports. The tool evaluation was carried out during the instruction flights of the 2009 Brazilian Flight Test School (CEV). The results show a substantial efficiency gain for the overall Flight Test Campaign.
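As a rough illustration of the quasi-real-time workflow (test-point validation followed by automatic HTML report generation), consider the sketch below. It is hypothetical: the parameter names, limit values, and report layout are invented, not taken from the paper:

```python
def validate_points(points, limits):
    """Flag each test point whose parameters all fall inside the expected limits."""
    out = []
    for p in points:
        ok = all(lo <= p[name] <= hi for name, (lo, hi) in limits.items())
        out.append({**p, "valid": ok})
    return out

def html_report(points):
    """Render validated test points as a minimal HTML table."""
    rows = "".join(
        "<tr><td>{}</td><td>{}</td></tr>".format(p["id"], "PASS" if p["valid"] else "FAIL")
        for p in points)
    return ("<table><tr><th>Test point</th><th>Status</th></tr>"
            + rows + "</table>")

# Hypothetical test points and limits for an air-data calibration pass
points = [{"id": 1, "alt": 10000, "speed": 250},
          {"id": 2, "alt": 9000, "speed": 310}]
limits = {"alt": (9500, 10500), "speed": (200, 300)}
report = html_report(validate_points(points, limits))
```

In the real tool, the `points` would arrive over the TCP/IP or SCRAMnet link while the aircraft is still flying, and the report would be emitted as soon as the last valid point is gathered.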
264

Designing an Object-Oriented Data Processing Network

Yang, Hsueh-szu, Sadia, Nathan, Kupferschmidt, Benjamin 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / There are many challenging aspects to processing data from a modern high-performance data acquisition system. The sheer diversity of data formats and protocols makes it very difficult to create a data processing application that can properly decode and display all types of data: many different tools must be harnessed, and each type of data must be shown on the correct type of display. In particular, it is very hard to synchronize the display of different types of data. This tends to be an error-prone, complex, and very time-consuming process. This paper discusses a solution to the problem of decoding and displaying many different types of data in the same system. This solution is based on the concept of a linked network of data processing nodes. Each node performs a particular task in the data decoding and/or analysis process. By chaining these nodes together in the proper sequence, we can define a complex decoder from a set of simple building blocks. This greatly increases the flexibility of the data visualization system while allowing for extensive code reuse.
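The linked-node idea can be sketched directly: each node performs one decoding step and forwards its output to the next node in the chain. The `Scale` and `Offset` nodes below are invented examples of simple building blocks, not classes from the paper:

```python
class Node:
    """One stage in a data processing chain; subclasses override process()."""
    def __init__(self):
        self.next = None
    def link(self, node):
        """Attach the next node and return it, so chains can be built fluently."""
        self.next = node
        return node
    def push(self, data):
        """Process data, then forward the result down the chain."""
        out = self.process(data)
        return self.next.push(out) if self.next is not None else out
    def process(self, data):
        return data

class Scale(Node):
    """Multiply every sample by a constant factor (e.g. counts -> volts)."""
    def __init__(self, factor):
        super().__init__()
        self.factor = factor
    def process(self, data):
        return [x * self.factor for x in data]

class Offset(Node):
    """Add a constant offset to every sample (e.g. sensor zero correction)."""
    def __init__(self, offset):
        super().__init__()
        self.offset = offset
    def process(self, data):
        return [x + self.offset for x in data]

# Build a two-stage decoder: raw counts -> engineering units
head = Scale(0.5)
head.link(Offset(-10))
```

Pushing `[20, 40]` through this chain yields `[0.0, 10.0]`; swapping, inserting, or reordering nodes redefines the decoder without touching any node's internals, which is the flexibility and code reuse the paper describes.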
265

An investigation of the effect of the European currency union (Euro) on sectoral trade : an application of the gravity model of trade

Awa, Ruth January 2015 (has links)
The introduction of the single currency (Euro) in Europe has been referred to as the ‘world’s largest economic experiment’ and has led to major research on the effects of the adoption of a common currency on economic activity, with considerable emphasis on its effect on trade flows at the macroeconomic level. However, the investigation of the euro effect on individual sectors has received very little attention, and this provides the motivation for the research. The main contribution of this thesis is to the sectoral analysis of the single currency’s effect on bilateral trade flows, specifically the effects on the transport equipment manufacturing sector. In order to achieve this, a comparison of the different estimation methods applied in the gravity model literature will be employed to investigate this effect and to identify the factors affecting trade in this sector. This study uses a panel data set which comprises the most recent information on bilateral trade for the EU15 countries from 1990 to 2008. This research aims to build on the results obtained in previous studies by employing a more refined empirical methodology and associated tests. The purpose of the tests is to ensure that the euro’s effect on trade is isolated from the other pro-trade policies of the European integration processes, particularly the introduction of the Single Market. The desirable feature of this approach is that, while other studies limit their attention to a particular issue (zero trade flow, time trend, sectoral analysis, cross-correlation, etc.), very few, if any, apply a selection of techniques. Overall, the results demonstrate that the single currency’s effect on trade in this sector is limited, with only the fixed effects formulation with year dummy variables showing a significant positive effect of the euro.
An obvious policy implication for countries looking to adopt a single currency is that they should be cautious regarding the potential for growth in intra-bloc trade in a particular sector, although they will benefit from the on-going process of integration.
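A minimal version of the gravity regression can be sketched on synthetic data: log trade is regressed on log GDP, log distance, and a euro-membership dummy. Everything below (coefficient values, sample size, noise level) is invented for illustration, and the plain least-squares fit is far simpler than the panel fixed-effects methods and diagnostic tests used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
log_gdp = rng.normal(10, 1, n)     # log of the GDP product of a country pair (hypothetical)
log_dist = rng.normal(7, 0.5, n)   # log bilateral distance (hypothetical)
euro = rng.integers(0, 2, n)       # 1 if both partners use the euro

# Synthetic data-generating process: the euro dummy raises log trade by 0.05
log_trade = (1.0 + 0.8 * log_gdp - 1.1 * log_dist
             + 0.05 * euro + rng.normal(0, 0.1, n))

# Gravity equation: log T_ij = b0 + b1 log(GDP) + b2 log(dist) + b3 euro
X = np.column_stack([np.ones(n), log_gdp, log_dist, euro])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)
```

With enough data, `beta` recovers the generating coefficients; in the thesis, isolating `b3` from other integration effects (e.g. the Single Market) is exactly what the year dummies and associated tests are for.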
266

A novel platform incorporating multiple forms of communication to support applications in a mobile environment

Elton, James January 2014 (has links)
This thesis discusses the creation of a novel platform that incorporates multiple communication methods, including SMS, email and web-based technologies, for interacting with users of mobile communication devices. The platform utilises people in a mobile environment to solve a range of different application problems, where each problem is a separate and distinct scenario type with unique objectives. There are existing applications available that interact with users of mobile communication devices to provide a service, such as regular weather updates to the users. Other applications have been designed to manage and coordinate the users to perform tasks within a mobile environment, such as performing field studies for scientific purposes. However, the existing applications are designed for only one specific scenario, with the design and implementation solely focused on solving the objectives of that scenario. Each component of these applications needs to be developed from scratch in order to cater for the application's requirements. There is currently no integrated communications platform that offers a framework for supporting a range of different scenario types. The new platform, entitled the Connected-Mobile Platform, aims to support the rapid development and implementation of new scenarios. This platform is composed of a framework of generic components that enable the active running of multiple scenarios concurrently, with the ability to tailor to the requirements of new scenarios as they arise via a structured process. The platform facilitates a means to coordinate its users in order to tackle the objectives of a scenario. The thesis investigates several system architectures to determine an appropriate architectural design for constructing the proposed platform. The platform has a generic framework, based on a client-server architecture, to facilitate the inclusion of a multitude of scenarios.
A scenario represents a problem or an event, whereby the platform can utilise and interact with users of mobile communication devices to attempt to solve the objectives of the scenario. Three mobile communication methods are supported: the Short Message Service, electronic mail and web-forms via the mobile internet. Users are able to select and switch between the different methods. The thesis describes the platform's tailored communication structure for scenarios and autonomous analysis of messages. The thesis discusses case studies of two different scenarios to evaluate the platform's facilities for rapid scenario development. The Diet Diary scenario, which is for individual users, aims to manage a user's daily calorie intake to help them reach their desired weight goal. The focus is on the platform's functionality for analysing and responding to messages autonomously. The Missing Persons scenario, which utilises multiple users, involves tracking and locating people who have been reported missing. The focus is on the platform's functionality for coordinating the multiple users, through the creation of assignments, in order to distribute the scenario objectives. The thesis concludes by highlighting the novel features of the platform and identifying opportunities for future work.
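The generic-framework idea (one platform routing messages from any channel to the scenario that owns them) can be sketched as follows. This is a speculative illustration of the architecture, not code from the thesis: the class names, the keyword-based message analysis, and the Diet Diary command are all invented:

```python
class Scenario:
    """One scenario type: registers keyword handlers for incoming user messages."""
    def __init__(self, name):
        self.name = name
        self.handlers = {}
    def on(self, keyword, func):
        self.handlers[keyword.lower()] = func
    def handle(self, message):
        # Autonomous analysis: dispatch on the first word of the message
        keyword, _, rest = message.partition(" ")
        func = self.handlers.get(keyword.lower())
        return func(rest) if func else "Unknown command: " + keyword

class Platform:
    """Routes each incoming message to its target scenario, regardless of
    whether it arrived via SMS, email, or a web form."""
    def __init__(self):
        self.scenarios = {}
    def register(self, scenario):
        self.scenarios[scenario.name] = scenario
    def route(self, scenario_name, message):
        return self.scenarios[scenario_name].handle(message)

# A hypothetical Diet Diary scenario running on the platform
diet = Scenario("diet")
diet.on("ate", lambda rest: "Logged {} kcal".format(rest))
platform = Platform()
platform.register(diet)
```

New scenarios plug into the same `Platform` without changing the routing code, which is the rapid-development property the thesis evaluates.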
267

Galactic microlensing : binary-lens light curve morphologies and results from the Rosetta spacecraft bulge survey

Liebig, Christine Elisabeth January 2014 (has links)
For 20 years now, gravitational microlensing observations towards the Galactic bulge have provided us with a wealth of information about the stellar and planetary content of our Galaxy, which is inaccessible via other current methods. This thesis summarises work on two research topics that arose in the context of exoplanetary microlensing, but we take a step back and consider ways of increasing our understanding of more fundamental phenomena: firstly, stellar microlenses in our Galaxy that were stereoscopically observed and, secondly, the morphological variety of binary-lens light curves. In autumn 2008, the ESA Rosetta spacecraft surveyed the Galactic bulge for microlensing events. With a baseline of ∼1.6 AU between the spacecraft and ground observations, significant parallax effects can be expected. We develop a photometry pipeline to deal with a severely undersampled point spread function in the crowded fields of the Galactic bulge, making use of complementary ground observations. Comparison of Rosetta and OGLE light curves provides the microlens parallax π_E, which constrains the mass and distance of the observed lenses. The lens mass could be fully determined if future proper motion measurements were obtained, whereas the lens distance additionally requires the determination of the source distance. In the second project, we present a detailed study of microlensing light curve morphologies. We provide a complete morphological classification for the case of the equal-mass binary lens, which makes use of the realisation that any microlensing peak can be categorised as one of only four types: cusp-grazing, cusp-crossing, fold-crossing or fold-grazing. As a means for this classification, we develop a caustic feature notation, which can be universally applied to binary lens caustics.
Ultimately, this study aims to refine light curve modelling approaches by providing an optimal choice of initial parameter sets, while ensuring complete coverage of the relevant parameter space.
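For context, the parallax effect exploited above rests on the standard single-lens magnification law: two observers separated by ∼1.6 AU see the source at slightly different lens-source separations u(t), hence slightly different light curves, and that difference yields π_E. A minimal sketch of this standard point-source, point-lens formula (not the binary-lens light curves classified in the thesis):

```python
import math

def magnification(u):
    """Point-source, point-lens microlensing magnification,
    A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)),
    where u is the lens-source separation in units of the Einstein radius."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))
```

At u = 1 (source on the Einstein ring) the magnification is 3/√5 ≈ 1.34, and it diverges as u → 0, which is why even small parallax shifts in u produce measurable brightness differences between the spacecraft and the ground.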
268

MULTIPROCESSOR-BASED DATA ACQUISITION AND ANALYSIS

Lynch, Thomas J., III, Fortmann, Thomas E., Briscoe, Howard, Fidell, Sanford 11 1900 (has links)
International Telemetering Conference Proceedings / October 30-November 02, 1989 / Town & Country Hotel & Convention Center, San Diego, California / Multiprocessing computer systems offer several attractive advantages for telemetry-related data acquisition and processing applications. These include: (1) high-bandwidth, fail-soft operation with convenient, low-cost growth paths, (2) cost-effective integration and clustering of data acquisition, decommutation, monitoring, archiving, analysis, and display processing, and (3) support for modern telemetry system architectures that allow concurrent network access to test data (for both real-time and post-test analyses) by multiple analysts. This paper asserts that today’s general-purpose hardware and software offer viable platforms for these applications. One such system, currently under development, closely couples VME data buses and other off-the-shelf components, parallel processing computers, and commercial data analysis packages to acquire, process, display, and analyze telemetry and other data from a major weapon system. This approach blurs the formerly clear architectural distinction in telemetry data processing systems between special-purpose, front-end preprocessing hardware and general-purpose, back-end host computers used for further processing and display.
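The staged architecture described above (acquisition, decommutation, and analysis running concurrently) can be illustrated with a producer-consumer pipeline. The sketch below uses Python threads and queues purely as a stand-in for the paper's VME-bus and parallel-processor hardware; the 16-bit word layout and scale factor are invented for illustration:

```python
import queue
import threading

def stage(func, inbox, outbox):
    """Run one pipeline stage: apply func to items until the sentinel None."""
    while True:
        item = inbox.get()
        if item is None:
            outbox.put(None)   # propagate shutdown to the next stage
            break
        outbox.put(func(item))

# Three-queue, two-stage pipeline: raw words -> decommutated fields -> EU values
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    threading.Thread(target=stage, args=(lambda w: (w >> 8, w & 0xFF), q1, q2)),
    threading.Thread(target=stage, args=(lambda t: t[0] * 0.1, q2, q3)),
]
for t in threads:
    t.start()
for word in (0x0102, 0x2003):   # hypothetical raw telemetry words
    q1.put(word)
q1.put(None)

results = []
while True:
    r = q3.get()
    if r is None:
        break
    results.append(r)
for t in threads:
    t.join()
```

Each stage runs independently and communicates only through queues, mirroring how front-end decommutation and back-end analysis can overlap in a multiprocessor system.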
269

Measuring subhalo mass in redMaPPer clusters with CFHT Stripe 82 Survey

Li, Ran, Shan, Huanyuan, Kneib, Jean-Paul, Mo, Houjun, Rozo, Eduardo, Leauthaud, Alexie, Moustakas, John, Xie, Lizhi, Erben, Thomas, Van Waerbeke, Ludovic, Makler, Martin, Rykoff, Eli, Moraes, Bruno 21 May 2016 (has links)
We use the shear catalogue from the CFHT Stripe-82 Survey to measure the subhalo masses of satellite galaxies in redMaPPer clusters. Assuming a Chabrier initial mass function and a truncated NFW model for the subhalo mass distribution, we find that the subhalo-mass-to-stellar-mass ratio increases as a function of projected halo-centric radius r_p, from M_sub/M_star = 4.43 (+6.63/-2.23) at r_p ∈ [0.1, 0.3] h^-1 Mpc to M_sub/M_star = 75.40 (+19.73/-19.09) at r_p ∈ [0.6, 0.9] h^-1 Mpc. We also investigate the dependence of subhalo masses on stellar mass by splitting satellite galaxies into two stellar mass bins: 10 < log(M_star/h^-1 M_⊙) < 10.5 and 11 < log(M_star/h^-1 M_⊙) < 12. The best-fitting subhalo mass of the more massive satellite galaxy bin is larger than that of the less massive satellites: log(M_sub/h^-1 M_⊙) = 11.14 (+0.66/-0.73) (M_sub/M_star = 19.5 (+19.8/-17.9)) for the less massive bin versus log(M_sub/h^-1 M_⊙) = 12.38 (+0.16/-0.16) (M_sub/M_star = 21.1 (+7.4/-7.7)) for the more massive bin.
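The truncated NFW fit above builds on the standard NFW density profile. As a hedged illustration (the untruncated profile, not the truncated model actually fitted in the paper), the mass enclosed within radius r of an NFW halo with characteristic density rho_s and scale radius r_s follows a closed form:

```python
import math

def nfw_mass(r, rho_s, r_s):
    """Mass enclosed within radius r for an (untruncated) NFW profile:
    M(r) = 4 pi rho_s r_s^3 [ln(1 + r/r_s) - (r/r_s)/(1 + r/r_s)].
    Units are whatever rho_s and r_s are expressed in."""
    x = r / r_s
    return 4.0 * math.pi * rho_s * r_s ** 3 * (math.log1p(x) - x / (1.0 + x))
```

The logarithmic term makes M(r) grow without bound, which is one reason subhalo fits use a truncated variant: tidal stripping inside the cluster removes the outer mass, and the truncation radius encodes that.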
270

Island biogeography of young land uplift islands - viewed through the lens of bryophytes in a northern Swedish archipelago / Öbiogeografi hos unga landhöjningsöar - betraktad ur ett mossperspektiv.

Karlsson Tiselius, Andreas January 2016 (has links)
Increasing habitat fragmentation and rapid global warming are changing the conditions for species populations and ecological communities around the world. This presents challenges for the maintenance of biodiversity, and a dominant paradigm for conservation in fragmented habitats is given by island biogeography and metapopulation (or metacommunity) ecology. In this thesis I approach key concepts (area, connectivity and community assembly) in island biogeography and metacommunity ecology within the context of a dynamic land uplift archipelago. The presented work consists of two interwoven themes: (i) a methodological theme in which statistical approaches are developed to deal with the complexities of multispecies dynamic systems, and (ii) an applied theme dealing with community assembly and island biogeography of bryophytes on young land uplift islands. To describe island connectivity for entire species assemblages, an approach using functional principal component analysis (fPCA) on patch connectivity functions (the connectivity of an island as a continuous function of a variable representing the spatial scale of species dispersal capacities) was developed. In addition, a new statistical method, functional co-inertia analysis (fCoIA), for analyzing co-variation between multivariate species data and continuous functions was developed and applied to the relation between bryophyte species incidences and the island age/area dynamics. Primarily asexual bryophyte species are dispersal-limited, and their presence probabilities are related to island connectivity. No such patterns were found for species that, at least occasionally, produce spores. Our results suggest that bryophyte dispersal is regulated by the contribution of spores to a regional spore rain, and that bryophyte species with low spore output at the landscape level may be especially vulnerable under habitat fragmentation and loss.
Having specialized asexual propagules increases the presence probabilities on islands, partly compensating for the dispersal limitation in asexual species. This suggests a trade-off between dispersal and establishment capacity, but also points to the importance of local dispersal for maintaining populations under the succession-driven spatial turnover of microsites on the islands. Bryophyte colonization is strongly limited by habitat availability when a given habitat is rare, but there seems to exist a threshold above which other processes (e.g. dispersal limitation) become more important. Species with more vagile life history strategies appear to be more strongly affected by the area of available habitats than many perennial species.
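The patch connectivity function underlying the fPCA can be made concrete: for each island i, a Hanski-type connectivity S_i(alpha) = sum over j of exp(-alpha * d_ij) * A_j is evaluated over a grid of the dispersal-scale parameter alpha, yielding one curve per island as input to the functional PCA. The sketch below assumes this particular connectivity measure, and the distances and areas are invented:

```python
import math

def connectivity(distances, areas, alpha):
    """Hanski-type connectivity of each island i:
    S_i(alpha) = sum_{j != i} exp(-alpha * d_ij) * A_j,
    where d_ij are pairwise distances and A_j are island areas."""
    n = len(areas)
    return [sum(math.exp(-alpha * distances[i][j]) * areas[j]
                for j in range(n) if j != i)
            for i in range(n)]

# Hypothetical three-island archipelago
distances = [[0, 1, 2],
             [1, 0, 1],
             [2, 1, 0]]
areas = [1.0, 2.0, 3.0]

# One connectivity curve per island, over a grid of dispersal scales alpha
alphas = [0.1 * k for k in range(1, 11)]
curves = [connectivity(distances, areas, a) for a in alphas]
```

Each island's curve decreases with alpha (shorter dispersal means less connectivity), and the set of curves is exactly the kind of functional data the fPCA summarizes across an entire species assemblage.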
