301 |
Delikvence mládeže a její hodnotové souvislosti / Juvenile delinquency and its moral aspectsPrůšová, Barbora January 2014 (has links)
This thesis analyses youth delinquency in terms of Per-Olof H. Wikström's Situational Action Theory, modelling data from the International Self-Report Delinquency Study 3 (ISRD-3). The main aim of the thesis is to introduce and evaluate this theoretical-empirical model as an explanation of youth delinquency. The work is split into three main parts - theoretical, methodological and empirical. The first part defines the basic concepts and shows how Wikström's Situational Action Theory applies to the topic of delinquency. The methodological part describes the ISRD-3 survey, the basic characteristics of the sample and the data collection methods used, and then explains how the individual explanatory variables in the model were operationalized. The empirical part is dedicated to multidimensional analysis of the data and evaluation of the concept. The results demonstrate the success of the analytical model and support its application as a default theory in the examination of youth delinquency.
|
302 |
Riešenie problému globálnej optimalizácie využitím GPU / Employing GPUs in Global Optimization ProblemsHošala, Michal January 2014 (has links)
The global optimization problem -- i.e., the problem of finding the global extreme points of a given function on a restricted domain -- appears in many real-world applications. Improving the efficiency of this task can reduce the latency of the application or provide more precise results, since the task is usually solved by an approximative algorithm. This thesis focuses on the practical aspects of global optimization algorithms, especially in the domain of algorithmic trading data analysis. Successful implementations of global optimization solvers already exist for CPUs, but they are quite time-demanding. The main objective of this thesis is to design a GO solver that utilizes the raw computational power of GPU devices. Despite the fact that GPUs have significantly more computational cores than CPUs, parallelizing a known serial algorithm is often quite challenging due to the specific execution model and the memory architecture constraints of existing GPU architectures. The thesis therefore explores multiple approaches to the problem and presents their experimental results.
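The thesis itself targets GPUs; as a rough illustration of why the problem maps well onto massively parallel hardware, the sketch below (not from the thesis; `random_search_minimize` and the synthetic Rastrigin test function are illustrative choices) evaluates a large batch of candidate points in one vectorized call, the same embarrassingly parallel pattern a GPU kernel would exploit:

```python
import numpy as np

def random_search_minimize(f, bounds, n_samples=100_000, seed=0):
    """Approximate the global minimum of f over a box domain by scoring
    many candidate points in a single vectorized batch."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds).T          # bounds: list of (low, high) per dimension
    pts = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    vals = f(pts)                          # f must accept an (n, d) array of points
    best = np.argmin(vals)
    return pts[best], vals[best]

# Rastrigin function: many local minima, global minimum 0 at the origin --
# a standard stress test for global (as opposed to local) optimizers.
def rastrigin(x):
    return 10 * x.shape[1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=1)

x_best, f_best = random_search_minimize(rastrigin, [(-5.12, 5.12)] * 2)
```

A serial local optimizer started at a random point would typically get stuck in one of Rastrigin's many local minima; the batch evaluation sidesteps that by brute breadth, which is exactly the trade-off GPU throughput makes attractive.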
|
303 |
Determinants of Financial DevelopmentBzhalava, Eri January 2014 (has links)
The paper studies the effects of country-level determinants on the rate of financial development and, in particular, assesses empirically whether democracy and political freedom can enhance financial development, as measured by Bank Private Credit to GDP and Liquid Liabilities to GDP. Using fixed-effects estimation techniques and panel data for 39 countries over the period 1990 to 2011, we provide evidence of a positive link between political openness and financial development. The empirical evidence also confirms that financial openness and real per capita income are positively correlated with financial deepening; in contrast, we find that the size of the financial sector does not spur the rate of financial development.
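The fixed-effects (within) estimator the abstract refers to can be sketched in a few lines: demean the outcome and the regressors within each country, then run pooled OLS on the demeaned data, which wipes out any time-invariant country effect. This is a generic illustration on synthetic data, not the paper's dataset; the function name and the 39x22 panel shape are illustrative.

```python
import numpy as np

def fixed_effects_beta(y, X, groups):
    """Within (fixed-effects) estimator: demean y and X inside each group,
    then fit pooled OLS on the demeaned data."""
    y_d = np.empty_like(y, dtype=float)
    X_d = np.empty_like(X, dtype=float)
    for g in np.unique(groups):
        m = groups == g
        y_d[m] = y[m] - y[m].mean()
        X_d[m] = X[m] - X[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X_d, y_d, rcond=None)
    return beta

# Synthetic panel: 39 "countries" x 22 "years", one regressor with true
# slope 0.5, plus a country effect correlated with the regressor
# (which would bias a naive pooled OLS but not the within estimator).
rng = np.random.default_rng(1)
n_c, n_t = 39, 22
groups = np.repeat(np.arange(n_c), n_t)
alpha = rng.normal(size=n_c)
x = alpha[groups] + rng.normal(size=n_c * n_t)
y = 0.5 * x + 2.0 * alpha[groups] + 0.1 * rng.normal(size=n_c * n_t)
beta = fixed_effects_beta(y, x.reshape(-1, 1), groups)
```

Because the country effect `alpha` is correlated with the regressor, the within transformation is what lets the slope estimate recover the true 0.5.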
|
304 |
Delikvence mládeže a její hodnotové souvislosti / Juvenile delinquency and its moral aspectsPrůšová, Barbora January 2014 (has links)
This thesis analyses youth delinquency in terms of Per-Olof H. Wikström's Situational Action Theory, modelling data from the International Self-Report Delinquency Study 3 (ISRD-3). The main aim of the thesis is to introduce and evaluate this theoretical-empirical model as an explanation of youth delinquency. The work is split into three main parts - theoretical, methodological and empirical. The first part defines the basic concepts and shows how Wikström's Situational Action Theory applies to the topic of delinquency. The methodological part describes the ISRD-3 survey, the basic characteristics of the sample and the data collection methods used. The empirical part explains how the individual explanatory variables in the model were operationalized, and is dedicated to multidimensional analysis of the data and evaluation of the concept. The results demonstrate the success of the analytical model and support its application as a default theory in the examination of youth delinquency. Key words: youth delinquency, Situational Action Theory, multidimensional data analysis
|
305 |
Studium fotonových silových funkcí z termálního záchytu neutronů / Study of photon strength functions from thermal neutron captureBauer, Karel January 2016 (has links)
This thesis deals with the description of the $\gamma$-ray deexcitation of neutron resonances produced in thermal neutron capture below the neutron separation energy. Its subject is obtaining information on the absolute value of the \textit{photon strength function} (PSF) from primary transitions in thermal neutron capture. The aim is to map and bring new information on the absolute value of the PSF in $^{156}$Gd and $^{158}$Gd. The method used in this thesis can lead to the rejection of several models of the PSF and of the level density.
|
306 |
Information visualisation and data analysis using web mash-up systemsKhan, Wajid January 2014 (has links)
The arrival of e-commerce systems has contributed greatly to the economy and has played a vital role in collecting a huge amount of transactional data. With the production of such a colossal volume of data, it is becoming ever more difficult to analyse business and consumer behaviour. Enterprise 2.0 systems can create and store an enormous amount of transactional data, yet the purpose for which the data was collected is easily lost as essential information goes unnoticed in large and complex data sets. Information overflow is a major contributor to this dilemma. In the current environment, where hardware systems can store such large volumes of data and software systems are capable of substantial data production, data exploration problems are on the rise. The problem lies not with the production or storage of data but with the effectiveness of the systems and techniques by which essential information can be retrieved from complex data sets in a comprehensive and logical way as data questions are asked. With existing information retrieval systems and visualisation tools, the more specific the questions asked, the more definitive and unambiguous the visualised results that can be attained; but for complex and large data sets there are no elementary or simple questions. A profound information visualisation model and system is therefore required to analyse complex data sets through data analysis and information visualisation, making it possible for decision makers to identify the expected and discover the unexpected. To address complex data problems, a comprehensive and robust visualisation model and system is introduced. The visualisation model consists of four major layers: (i) acquisition and data analysis, (ii) data representation, (iii) user and computer interaction and (iv) results repositories. 
There are major contributions in all four layers, particularly in data acquisition and data representation. Multiple-attribute and dimensional data visualisation techniques are identified in the Enterprise 2.0 and Web 2.0 environment. Transactional tagging and linked data are unearthed, a novel contribution in information visualisation. The visualisation model and system is first realised as a tangible software system, which is then validated on different and large types of data sets in three experiments. The first experiment is based on the large Royal Mail postcode data set. The second experiment is based on a large transactional data set in an enterprise environment, while the same data set is processed in a non-enterprise environment. System interaction, facilitated through new mashup techniques, enables users to interact more fluently with the data and the representation layer. The results are exported into various reusable formats and retrieved for further comparison and analysis. The information visualisation model introduced in this research is a compact process for data sets of any size and type, which is a major contribution in information visualisation and data analysis. Advanced data representation techniques are employed using various web mashup technologies. New visualisation techniques have emerged from the research, such as transactional tagging visualisation and linked data visualisation. The information visualisation model and system is extremely useful in addressing complex data problems with strategies that are easy to interact with and integrate.
|
307 |
ASPCAP: THE APOGEE STELLAR PARAMETER AND CHEMICAL ABUNDANCES PIPELINEGarcía Pérez, Ana E., Prieto, Carlos Allende, Holtzman, Jon A., Shetrone, Matthew, Mészáros, Szabolcs, Bizyaev, Dmitry, Carrera, Ricardo, Cunha, Katia, García-Hernández, D. A., Johnson, Jennifer A., Majewski, Steven R., Nidever, David L., Schiavon, Ricardo P., Shane, Neville, Smith, Verne V., Sobeck, Jennifer, Troup, Nicholas, Zamora, Olga, Weinberg, David H., Bovy, Jo, Eisenstein, Daniel J., Feuillet, Diane, Frinchaboy, Peter M., Hayden, Michael R., Hearty, Fred R., Nguyen, Duy C., O’Connell, Robert W., Pinsonneault, Marc H., Wilson, John C., Zasowski, Gail 23 May 2016 (has links)
The Apache Point Observatory Galactic Evolution Experiment (APOGEE) has built the largest moderately high-resolution (R ≈ 22,500) spectroscopic map of stars across the Milky Way, including dust-obscured areas. The APOGEE Stellar Parameter and Chemical Abundances Pipeline (ASPCAP) is the software developed for the automated analysis of these spectra. ASPCAP determines atmospheric parameters and chemical abundances by comparing observed spectra to libraries of theoretical spectra, using χ² minimization in a multidimensional parameter space. The package consists of a FORTRAN90 code that does the actual minimization and a wrapper IDL code for book-keeping and data handling. This paper explains the ASPCAP components and functionality in detail, and presents results from a number of tests designed to check its performance. ASPCAP provides stellar effective temperatures, surface gravities, and metallicities precise to 2%, 0.1 dex, and 0.05 dex, respectively, for most APOGEE stars, which are predominantly giants. It also provides abundances for up to 15 chemical elements with various levels of precision, typically under 0.1 dex. The final data release (DR12) of the Sloan Digital Sky Survey III contains an APOGEE database of more than 150,000 stars. ASPCAP development continues in the SDSS-IV APOGEE-2 survey.
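The core χ² operation the abstract describes -- scoring an observed spectrum against every template in a parameter grid and taking the minimizer -- can be sketched generically. This toy example is not the ASPCAP code (which is FORTRAN90/IDL); the single-parameter Gaussian-line "library" stands in for the real multidimensional Teff/log g/[M/H] grid.

```python
import numpy as np

def best_fit_chi2(obs, err, library):
    """Return the index of the library spectrum minimizing
    chi^2 = sum(((obs - model) / err)^2), and that chi^2 value."""
    chi2 = np.sum(((obs[None, :] - library) / err[None, :]) ** 2, axis=1)
    best = np.argmin(chi2)
    return best, chi2[best]

# Toy "library": one Gaussian absorption line whose depth scales with a
# single hypothetical parameter (a stand-in for the real grid axes).
wave = np.linspace(0.0, 1.0, 200)
params = np.linspace(0.1, 1.0, 10)
library = 1.0 - params[:, None] * np.exp(-((wave - 0.5) / 0.05) ** 2)

rng = np.random.default_rng(0)
err = np.full_like(wave, 0.01)
truth = 4                                   # index of the "true" template
obs = library[truth] + rng.normal(0.0, 0.01, wave.size)
idx, chi2_min = best_fit_chi2(obs, err, library)
```

In the real pipeline the grid is multidimensional and the minimization is followed by interpolation between grid points, but the per-template weighted residual sum is the same quantity.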
|
308 |
Implementing a Class of Permutation Tests: The coin PackageHothorn, Torsten, Hornik, Kurt, van de Wiel, Mark A., Zeileis, Achim January 2007 (has links) (PDF)
The R package coin implements a unified approach to permutation tests, providing a huge class of independence tests for nominal, ordered, numeric, and censored data as well as multivariate data at mixed scales. Based on a rich and flexible conceptual framework that embeds different permutation test procedures into a common theory, a computational framework is established in coin that likewise embeds the corresponding R functionality in a common S4 class structure with associated generic functions. As a consequence, the computational tools in coin inherit the flexibility of the underlying theory, and conditional inference functions for important special cases can be set up easily. Conditional versions of classical tests - such as tests for location and scale problems in two or more samples, independence in two- or three-way contingency tables, or association problems for censored, ordered categorical or multivariate data - can easily be implemented as special cases using this computational toolbox by choosing appropriate transformations of the observations. The paper gives a detailed exposition of both the internal structure of the package and the provided user interfaces. / Series: Research Report Series / Department of Statistics and Mathematics
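The conditional-inference idea behind coin is easiest to see in the simplest special case, a two-sample location problem: under the null hypothesis the group labels are exchangeable, so the labels are re-randomized and the observed statistic is compared against its permutation distribution. The sketch below is a minimal Python analogue for illustration (coin itself is an R package with a far more general interface); the function name and data are illustrative.

```python
import numpy as np

def permutation_test_two_sample(x, y, n_perm=10_000, seed=0):
    """Two-sided permutation p-value for a difference in means:
    shuffle the pooled sample, split it back into two groups of the
    original sizes, and count how often the permuted statistic is at
    least as extreme as the observed one."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    n = len(x)
    observed = abs(x.mean() - y.mean())
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        if abs(perm[:n].mean() - perm[n:].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one correction keeps p > 0

rng = np.random.default_rng(42)
x = rng.normal(0.0, 1.0, 30)
y = rng.normal(1.5, 1.0, 30)            # clearly shifted second group
p = permutation_test_two_sample(x, y)
```

Swapping the mean-difference statistic for a rank, score, or multivariate statistic yields the other "classical tests as special cases" the abstract mentions, which is exactly the flexibility coin's common framework packages up.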
|
309 |
Determinants of Foreign Direct Investment: A panel data analysis of the MINT countriesGöstas Escobar, Alexandra, Fanbasten, Niko January 2016 (has links)
One of the most visible signs of the globalization of the world economy is the increase of Foreign Direct Investment (FDI) inflows across countries. Over the past decade the trend of FDI has shifted from developed countries to emerging economies, most notably the BRICS countries. However, as the BRICS countries' reputation has been damaged in recent years by their weak growth outlook in the early 2010s, investors are shifting to a new economic grouping, the MINT countries (Mexico, Indonesia, Nigeria and Turkey), as a more promising future FDI destination. Since the MINT countries have emerged as a popular destination of FDI, it is necessary to investigate which key factors make these four countries attractive as FDI destinations. Hence, this paper analyzes the determinants of inward FDI into the MINT countries over the period from 1990 to 2014. To answer the research question and demonstrate the effect of the seven independent variables (market size, economic instability, natural resources availability, infrastructure facilities, trade openness, institutional stability and political stability) on FDI as the dependent variable, the study uses a panel data analysis. The data is secondary data, collected from the World Bank dataset. The empirical findings illustrate that market size, economic instability, infrastructure facilities, trade openness, institutional stability, and political stability are significant determinants of FDI inflows to the MINT countries, while natural resources availability appears to be an insignificant determinant of FDI inflows to the MINT countries.
|
310 |
Computational Models of Nuclear ProliferationFrankenstein, William 01 May 2016 (has links)
This thesis utilizes social influence theory and computational tools to examine the disparate impact of positive and negative ties in nuclear weapons proliferation. The thesis has broadly two sections: a simulation section, which focuses on government stakeholders, and a large-scale data analysis section, which focuses on public and domestic actor stakeholders. The simulation section demonstrates that the nonproliferation norm is an emergent behavior of political alliance and hostility networks, and that alliances play a role in present-day nuclear proliferation. The model is robust and captures second-order effects of extended hostility and alliance relations. The large-scale data analysis section demonstrates the role that context plays in sentiment evaluation and highlights how Twitter collection can provide useful input to policy processes. It first presents the results of an on-campus study showing that context plays a role in sentiment assessment. Then, in an analysis of a Twitter dataset of over 7.5 million messages, it assesses the role of 'noise' and biases in online data collection. In a deep dive analyzing the Iranian nuclear agreement, we demonstrate that the Middle East is not facing a nuclear arms race, and show that there is a structural hole in online discussion surrounding nuclear proliferation. By combining both approaches, policy analysts have a complete and generalizable set of computational tools to assess and analyze disparate stakeholder roles in nuclear proliferation.
|