181 |
Texture and Microstructure in Two-Phase Titanium Alloys
Mandal, Sudipto, 01 August 2017 (has links)
This work explores the processing-microstructure-property relationships in two-phase titanium alloys such as Ti-6Al-4V and Ti-5Al-5V-5Mo-3Cr that are used for aerospace applications. For this purpose, an Integrated Computational Materials Engineering approach is used. Microstructure and texture of titanium alloys are characterized using optical microscopy, electron backscatter diffraction and x-ray diffraction. To model their properties, three-dimensional synthetic digital microstructures are generated based on experimental characterization data. An open source software package, DREAM.3D, is used to create heterogeneous two-phase microstructures that are statistically representative of two-phase titanium alloys. Both mean-field and full-field crystal plasticity models are used for simulating uniaxial compression at different loading conditions. A viscoplastic self-consistent model is used to match the stress-strain response of the Ti-5553 alloy based on uniaxial compression tests. A physically-based Mechanical Threshold Stress (MTS) model is designed to cover wide ranges of deformation conditions. Uncertainties in the parameters of the MTS model are quantified using canonical correlation analysis, a multivariate global sensitivity analysis technique. An elasto-viscoplastic full-field model based on the fast Fourier transform algorithm is used to simulate the deformation response at both the microscopic and continuum levels. The probability distribution of stresses and strains for both phases in the two-phase material is examined statistically. The effect of changing HCP phase volume fraction and morphology is explored with the intent of explaining the flow softening behavior in titanium alloys.
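For context, the MTS framework decomposes the flow stress into an athermal part plus thermally activated intrinsic and strain-hardening contributions. The generic Follansbee-Kocks form is sketched below purely as a reference; the abstract does not give the specific terms or parameter values used in this work.

\[
\sigma(\dot{\varepsilon}, T) = \sigma_a + \frac{\mu(T)}{\mu_0}\left[ S_i(\dot{\varepsilon}, T)\,\hat{\sigma}_i + S_\varepsilon(\dot{\varepsilon}, T)\,\hat{\sigma}_\varepsilon \right],
\qquad
S_x(\dot{\varepsilon}, T) = \left[ 1 - \left( \frac{k_B T}{g_{0x}\,\mu(T)\,b^3}\,\ln\frac{\dot{\varepsilon}_{0x}}{\dot{\varepsilon}} \right)^{1/q_x} \right]^{1/p_x},
\]

where \(\sigma_a\) is the athermal stress, \(\hat{\sigma}_i\) and \(\hat{\sigma}_\varepsilon\) are the intrinsic and strain-hardening threshold stresses, \(\mu(T)\) is the temperature-dependent shear modulus, \(b\) is the Burgers vector, and \(g_{0x}\), \(\dot{\varepsilon}_{0x}\), \(p_x\), \(q_x\) are activation parameters for each component.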
|
182 |
Impact of Visualization on Engineers – A Survey
Shah, Dhaval Kashyap, 29 June 2016 (has links)
In recent years, there has been tremendous growth in data. Numerous studies and technologies have been proposed and developed in the field of Visualization to cope with the associated data analytics. Despite these new technologies, people's capacity to perform data analysis has not kept pace with the demand. Past literature has hinted at various reasons behind this disparity. The purpose of this research is to examine specifically the usage of Visualization in the field of engineering. We conducted the research with the help of a survey, identifying the areas where Visualization educational shortcomings may exist. We conclude by asserting that there is a need for creating awareness and formal education about Visualization for Engineers.
|
183 |
Visual exploratory analysis of large data sets: evaluation and application
Lam, Heidi Lap Mun, 11 1900 (has links)
Large data sets are difficult to analyze. Visualization has been proposed to assist exploratory data analysis (EDA) as our visual systems can process signals in parallel to quickly detect patterns. Nonetheless, designing an effective visual analytic tool remains a challenge.

This challenge is partly due to our incomplete understanding of how common visualization techniques are used by human operators during analyses, either in laboratory settings or in the workplace.

This thesis aims to further understand how visualizations can be used to support EDA. More specifically, we studied techniques that display multiple levels of visual information resolutions (VIRs) for analyses using a range of methods.

The first study is a summary synthesis conducted to obtain a snapshot of knowledge in multiple-VIR use and to identify research questions for the thesis: (1) low-VIR use and creation; (2) spatial arrangements of VIRs. The next two studies are laboratory studies to investigate the visual memory cost of image transformations frequently used to create low-VIR displays and overview use with single-level data displayed in multiple-VIR interfaces.

For a more well-rounded evaluation, we needed to study these techniques in ecologically-valid settings. We therefore selected the application domain of web session log analysis and applied our knowledge from our first three evaluations to build a tool called Session Viewer. Taking the multiple coordinated view and overview + detail approaches, Session Viewer displays multiple levels of web session log data and multiple views of session populations to facilitate data analysis from the high-level statistical to the low-level detailed session analysis approaches.

Our fourth and last study for this thesis is a field evaluation conducted at Google Inc. with seven session analysts using Session Viewer to analyze their own data with their own tasks. Study observations suggested that displaying web session logs at multiple levels using the overview + detail technique helped bridge between high-level statistical and low-level detailed session analyses, and the simultaneous display of multiple session populations at all data levels using multiple views allowed quick comparisons between session populations. We also identified design and deployment considerations to meet the needs of diverse data sources and analysis styles.
|
184 |
Characterization and mitigation of radiation damage on the Gaia Astrometric Field
Brown, Scott William, January 2011 (has links)
In November 2012, the European Space Agency (ESA) is planning to launch Gaia, a mission designed to measure the astrometric properties of over a billion stars with microarcsecond accuracy. Microarcsecond astrometry requires extremely accurate positional measurements of individual stellar transits on the focal plane, which can be disrupted by radiation-induced Charge Transfer Inefficiency (CTI). Gaia will suffer radiation damage, impacting the science performance, which has led to a series of Radiation Campaigns (RCs) being carried out by industry to investigate these issues. The goal of this thesis is to rigorously assess these campaigns and to inform how CTI is dealt with in the data processing. We begin in Chapter 1 by giving an overview of astrometry and photometry, introducing the concept of stellar parallax, and establishing why observing from space is paramount for performing global, absolute astrometry. As demonstrated by Hipparcos, the concept is sound. After reviewing the Gaia payload and discussing how astrometric and photometric parameters are determined in practice, we introduce the issue of radiation-induced CTI and how it may be dealt with. The on-board mitigating strategies are investigated in detail in Chapter 2. Here we analyse the effects of radiation damage as a function of magnitude with and without a diffuse optical background, charge injection and the use of gates, and also discover a number of calibration issues. Some of these issues are expected to be removed during flight testing; others will have to be dealt with as part of the data processing, e.g. CCD stitches and the charge injection tail. In Chapter 3 we turn to look at the physical properties of a Gaia CCD. Using data from RC2 we probe the density of traps (i.e. damaged sites) in each pixel and, for the first time, measure the Full Well Capacity of the Supplementary Buried Channel, a part of every Gaia pixel that constrains the passage of faint signals away from the bulk of traps throughout the rest of the pixel. The Data Processing and Analysis Consortium (DPAC) is currently adopting a 'forward modelling' approach to calibrate radiation damage in the data processing. This incorporates a Charge Distortion Model (CDM), which is investigated in Chapter 4. We find that although the CDM performs well, there are a number of degeneracies in the model parameters, which may be probed further by better experimental data and a more realistic model. Another way of assessing the performance of a CDM is explored in Chapter 5. Using a Monte Carlo approach we test how well the CDM can extract accurate image parameters. It is found that the CDM must be highly robust to achieve a moderate degree of accuracy and that the fitting is limited by assigning finite window sizes to the image shapes. Finally, in Chapter 6 we summarise our findings on the campaign analyses, the on-board mitigating strategies and on how well we are currently able to handle radiation damage in the data processing.
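To illustrate the kind of signal distortion being calibrated, here is a deliberately simplified charge-trailing sketch in Python; it is not the DPAC CDM or any Gaia code, just a toy in which each pixel transfer traps a small fraction of the charge and releases it into the pixels that follow.

import numpy as np

def add_cti_trail(column, trap_fraction=5e-3, release_tau=3.0):
    """Toy CTI model (illustrative only): at each pixel, a fraction of the
    charge is captured by traps and then released exponentially into the
    pixels that follow in the readout direction."""
    out = column.astype(float).copy()
    trapped = 0.0
    for i in range(len(out)):
        released = trapped * (1.0 - np.exp(-1.0 / release_tau))
        trapped -= released
        out[i] += released                  # charge re-emitted into the trail
        captured = trap_fraction * out[i]
        out[i] -= captured                  # charge lost from the signal
        trapped += captured
    return out

# A faint Gaussian stellar profile in a 40-pixel window, before and after damage.
x = np.arange(40)
star = 500.0 * np.exp(-0.5 * ((x - 20.0) / 1.5) ** 2)
damaged = add_cti_trail(star)
print("peak charge lost: %.1f e-" % (star.max() - damaged.max()))

In this toy the damaged profile loses charge and is skewed toward the trail, which is exactly the kind of image-parameter bias a forward-modelling calibration has to remove.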
|
185 |
Metody geomarketingu / Geomarketing methods
Voráč, Michal, January 2014 (has links)
The aim of this application-oriented master's thesis is to demonstrate the benefits of using data analysis techniques combined with geodata processing to support business decisions. In conclusion, two solutions are given that are more attractive than the starting situation. While the first proposed solution is oriented toward maximizing the success ratio of transactions, the second is oriented toward the business value of each transaction. The R programming language is used extensively throughout the thesis, together with ArcGIS Online in its final part.
|
186 |
Možnosti využitia Business Intelligence nástrojov v cloude / Possibilities of using Business Intelligence tools in the cloud
Roman, Martin, January 2012 (has links)
This thesis is devoted to Business Intelligence tools in the cloud, one of the main trends in this area. Cloud BI tools became widely used after companies began to realize the importance of data analysis for gaining a competitive advantage. The high cost of implementing BI led companies to look for tools that can be implemented and operated at significantly lower cost and outside their own infrastructure. The practical part of the thesis focuses on the analysis and comparison of currently available cloud BI tools. The analysis is based on practical examples intended to test the selected solutions for use in the corporate sector of small and medium enterprises. Each of the selected solutions is analyzed from several points of view and rated. Besides the analysis, the other main goal of the thesis is to evaluate the best solution for companies and to define the potential benefits and limitations that can follow from its implementation and operation.
|
187 |
Análise de dados utilizando a medida de tempo de consenso em redes complexas / Data analysis using the consensus time measure for complex networks
Jean Pierre Huertas Lopez, 30 March 2011 (has links)
Networks are powerful representations for many complex systems, where nodes represent elements of the system and edges represent connections between them. Complex networks can be defined as large-scale graphs with a non-trivial distribution of connections. An important topic in complex networks is community detection. Although community detection has produced good results in data clustering analysis with groups of different shapes, there are still some difficulties in representing a data set as a network. Another recent topic is the characterization of simplicity in complex networks. Few studies have been reported in this area; however, the topic is highly relevant, since it allows analyzing the simplicity of the structure of connections of a region of nodes or of the entire network. Moreover, by analyzing the simplicity of dynamic networks over time, it is possible to understand how the network evolution behaves in terms of simplicity. Considering the network as a coupled dynamical system of agents, this work proposes a distance measure based on the consensus time in the presence of a leader in a coupled network. Using this distance measure, we propose a method for detecting communities for data clustering analysis, and a method for simplicity analysis in complex networks. Furthermore, we propose a technique for building sparse networks for data clustering. The methods have been tested with artificial and real data, obtaining promising results.
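A minimal sketch of the underlying idea, assuming a standard discrete-time leader-follower consensus dynamics (the thesis's exact formulation is not given in this abstract): clamp one vertex, the leader, at a fixed value, let every other vertex repeatedly average with its neighbors, and record how many iterations each vertex needs to come within a tolerance of the leader's value. That iteration count acts as a consensus-time distance from the leader, and vertices in the leader's community tend to have smaller distances.

import numpy as np
import networkx as nx

def consensus_times(G, leader, alpha=0.5, tol=1e-3, max_iter=10000):
    """Leader-follower consensus: followers move a fraction alpha toward the
    average of their neighbors each step while the leader stays at 1.0.
    Returns the first iteration at which each vertex is within tol of 1.0."""
    nodes = list(G.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    x = np.zeros(len(nodes))
    x[idx[leader]] = 1.0
    times = np.full(len(nodes), np.inf)
    times[idx[leader]] = 0.0
    for t in range(1, max_iter + 1):
        new_x = x.copy()
        for v in nodes:
            if v == leader:
                continue
            nbrs = list(G.neighbors(v))
            if nbrs:
                new_x[idx[v]] = (1 - alpha) * x[idx[v]] + alpha * np.mean([x[idx[u]] for u in nbrs])
        x = new_x
        reached = np.abs(x - 1.0) < tol
        times[np.isinf(times) & reached] = t
        if reached.all():
            break
    return {v: times[idx[v]] for v in nodes}

# Two dense communities joined by a single edge: vertices in the leader's
# community reach consensus sooner, i.e. have a smaller "distance" to it.
G = nx.barbell_graph(8, 0)
distances = consensus_times(G, leader=0)
print(sorted(distances.items(), key=lambda kv: kv[1]))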
|
188 |
Identifying the factors that affect the severity of vehicular crashes by driver age
Tollefson, John Dietrich, 01 December 2016 (has links)
Vehicular crashes are the leading cause of death for young adult drivers; however, very little life course research focuses on drivers in their 20s. Moreover, most analyses of crash data are limited to simple correlation and regression analysis. This thesis proposes a data-driven approach and the use of machine-learning techniques to further enhance the quality of analysis.
We examine over 10 years of data from the Iowa Department of Transportation, transforming all the data into a format suitable for data analysis. From there, the ages of the drivers present in each crash are discretized into groups for better analysis. In doing this, we hope to better discover the relationship between driver age and the factors present in a given crash.
We use machine learning algorithms to determine important attributes for each age group with the goal of improving the predictive power of individual methods. The general format of this thesis follows a Knowledge Discovery workflow: preprocessing and transforming the data into a usable state, from which we perform data mining to discover results and produce knowledge.
We hope to use this knowledge to improve prediction for different age groups of drivers, using around 60 variables for most sets and 10 variables for some. We also explore future directions in which this data could be analyzed.
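As a rough illustration of this kind of workflow (not the thesis's actual pipeline: the file name, Iowa DOT field names, age bins, and algorithm choice below are placeholder assumptions), one could discretize driver age, fit a separate classifier per age group, and inspect which attributes matter most for severity.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder columns; a real crash extract has many more fields.
df = pd.read_csv("crashes.csv")
df["age_group"] = pd.cut(df["driver_age"], bins=[14, 19, 24, 34, 64, 120],
                         labels=["teen", "20-24", "25-34", "35-64", "65+"])

features = [c for c in df.columns if c not in ("severity", "driver_age", "age_group")]

# One model per age group; rank the attributes that drive severity predictions.
for group, sub in df.groupby("age_group", observed=True):
    X = pd.get_dummies(sub[features])
    y = sub["severity"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    top = sorted(zip(X.columns, model.feature_importances_), key=lambda kv: -kv[1])[:5]
    print(group, round(accuracy_score(y_te, model.predict(X_te)), 3), top)

Comparing the top-ranked attributes across age groups is what surfaces age-specific crash factors.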
|
189 |
A joint model of an internal time-dependent covariate and bivariate time-to-event data with an application to muscular dystrophy surveillance, tracking and research network data
Liu, Ke, 01 December 2015 (has links)
Joint modeling of a single event time response with a longitudinal covariate dates back to the 1990s. The three basic types of joint modeling formulations are selection models, pattern mixture models and shared parameter models. The shared parameter models are most widely used. One type of shared parameter model (Joint Model I) utilizes unobserved random effects to jointly model a longitudinal sub-model and a survival sub-model to assess the impact of an internal time-dependent covariate on the time-to-event response.
Motivated by the Muscular Dystrophy Surveillance, Tracking and Research Network (MD STARnet), we constructed a new model (Joint Model II) to jointly analyze correlated bivariate time-to-event responses associated with an internal time-dependent covariate in the Frequentist paradigm. This model exhibits two distinctive features: 1) a correlation between the bivariate time-to-event responses and 2) a time-dependent internal covariate in both survival models. Developing a model that sufficiently accommodates both characteristics poses a challenge. To address this challenge, in addition to the random variables that account for the association between the time-to-event responses and the internal time-dependent covariate, a Gamma frailty random variable was used to account for the correlation between the two event time outcomes. To estimate the model parameters, we adopted the Expectation-Maximization (EM) algorithm. We built a complete joint likelihood function with respect to both latent variables and observed responses. The Gauss-Hermite quadrature method was employed to approximate the two-dimensional integrals in the E-step of the EM algorithm, and maximum profile likelihood estimation was implemented in the M-step. The bootstrap method was then applied to estimate the standard errors of the estimated model parameters. Simulation studies were conducted to examine the finite sample performance of the proposed methodology. Finally, the proposed method was applied to MD STARnet data to assess the impact of shortening fractions and steroid use on the onsets of scoliosis and mental health issues.
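As a rough sketch of the kind of shared-parameter structure described (the abstract does not spell out the exact submodels, so the specific forms below are illustrative assumptions), the longitudinal covariate and the two event times for subject i might be linked as

\[
Y_i(t) = m_i(t) + e_i(t), \qquad m_i(t) = \beta_0 + \beta_1 t + b_{0i} + b_{1i} t, \qquad \mathbf{b}_i \sim N(\mathbf{0}, \Sigma),
\]
\[
\lambda_{ik}(t \mid \mathbf{b}_i, \omega_i) = \omega_i\, \lambda_{0k}(t)\, \exp\!\left\{ \boldsymbol{\gamma}_k^{\top} \mathbf{Z}_i + \alpha_k\, m_i(t) \right\}, \quad k = 1, 2, \qquad \omega_i \sim \mathrm{Gamma}(1/\theta,\, 1/\theta),
\]

where the shared random effects \(\mathbf{b}_i\) tie both hazards to the internal time-dependent covariate and the Gamma frailty \(\omega_i\) induces the correlation between the two event times; integrating the two-dimensional random effects out of the joint likelihood is what the Gauss-Hermite quadrature in the E-step approximates.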
|
190 |
Panel data analysis of fuel price elasticities to vehicle-miles traveled for first year participants of the national evaluation of a mileage-based road user charge study
Hatz, Charles Nicholas, II, 01 July 2011 (has links)
The impact of fuel price changes can be seen in practically all sectors of the United States economy. Fuel prices directly and indirectly influence the daily life of most Americans. The national economy, as well as the high standard of living we have come to enjoy in the United States, runs on gasoline. Since the late 1990s the days of cheap oil and $1.00 gallons of gas have clearly been over, so understanding the influence of fuel prices is more important now than ever. Since 1998, regular gasoline prices have increased by $0.22 per gallon per year on average, with little evidence suggesting this trend will slow down or reverse substantially. This drastic and permanent change to the status quo of fuel prices has potentially rendered traditional knowledge of fuel price elasticities inapplicable to current analysis. Obtaining accurate measures of fuel price elasticities is important because they are used as a measure of personal mobility and can be related to the quality of life the public is experiencing. Price elasticities are also used in determining the future revenue available for surface transportation projects. Traditionally, short-run fuel price elasticities are thought to be inelastic, allowing transportation agencies to ignore short-run fuel price changes to some degree when planning future projects and evaluating their economic feasibility. Using driving data collected from the National Evaluation of a Mileage-Based Road User Charge Study, the fuel price elasticity of vehicle-miles traveled (VMT), as well as the sensitivity to gas prices relative to a historical high price, was estimated for the first-year study participants using a panel data approach with linear regression. The short-run fuel price elasticity of VMT was determined to be -1.71, with a range of -1.93 to -1.48. The elasticities found are substantially larger in magnitude than the average short-run fuel price elasticity of -0.45, but can be rationalized by the impact that poor economic conditions, as well as the historically high fuel prices experienced prior to the research's time frame, had on individuals' driving behavior. The results suggest that current short-run elasticities are not inelastic; if this trend continues, transportation agencies must re-evaluate how they predict the future funding available for surface transportation projects.
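The elasticity in this kind of analysis is the coefficient on log fuel price in a log-log panel regression of VMT. A minimal sketch of that specification follows, assuming a long-format panel with one row per participant per period; the file and column names are placeholders, not the study's actual data layout.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Long-format panel, one row per participant per month
# (placeholder columns: driver_id, month, vmt, fuel_price).
panel = pd.read_csv("vmt_panel.csv")

# ln(VMT) on ln(fuel price) with driver and month fixed effects; the
# coefficient on np.log(fuel_price) is the short-run fuel price elasticity.
model = smf.ols("np.log(vmt) ~ np.log(fuel_price) + C(driver_id) + C(month)",
                data=panel).fit()
print("short-run elasticity: %.2f" % model.params["np.log(fuel_price)"])

An estimate near -1.7 would indicate elastic short-run demand, while values around -0.45 would match the traditional inelastic view.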
|