91 |
Problems and Possibilities with Non-Empirical Assessment of Scientific Theories : An Analysis of the Argument Given by Richard Dawid / Problem och möjligheter med icke-empirisk bedömning av vetenskapliga teorier : En analys av Richard Dawids argument
Skott, Anton, January 2020
This essay examines the argument given by Richard Dawid (2013, 2019) for the viability of non-empirical assessment of scientific theories. Dawid's argument is supposed to show that trust in a scientific theory can be justified without any direct empirical testing of the theory. This view is fundamentally different from what will be called the classical paradigm of theory assessment, which holds that only empirical testing can justify belief in a theory. It is argued in this essay that Dawid's argument does not provide sufficient reasons for claiming that non-empirical assessment is a valid form of justification of scientific theories. However, it is further argued that non-empirical assessment can still play an important role when evaluating the status of a theory that cannot yet be tested empirically.
|
92 |
Combining empirical mode decomposition with neural networks for the prediction of exchange rates / Jacques Mouton
Mouton, Jacques, January 2014
The foreign exchange market is one of the largest and most active financial markets, with enormous daily trading volumes. Exchange rates are influenced by the interactions of a large number of agents, each operating with different intentions and on different time scales. This gives rise to nonlinear and non-stationary behaviour, which complicates modelling. This research proposes a neural network-based model, trained on data filtered with a novel Empirical Mode Decomposition (EMD) filtering method, for the forecasting of exchange rates.
One minor and two major exchange rates are evaluated in this study. Firstly, the ideal prediction horizons for trading are calculated for each of the exchange rates. The data is filtered according to this ideal prediction horizon using the EMD-filter, which dynamically filters the data based on the apparent number of intrinsic modes in the signal that can contribute towards prediction over the selected horizon. The filter is employed to remove high-frequency noise and components that would not contribute to the prediction of the exchange rate at the chosen timescale. This results in a clearer signal that still includes nonlinear behaviour. An artificial neural network predictor is trained on the filtered data using different sampling rates that are compatible with the cut-off frequency. The neural network is able to capture the nonlinear relationships between historic and future filtered data with greater certainty than a neural network trained on unfiltered data.
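The filtering step described above can be illustrated with a minimal sketch. It assumes the third-party PyEMD package and a synthetic price series, and it stands in for the dissertation's dynamic mode selection with a fixed cut-off index k; it illustrates EMD-based low-pass filtering in general, not the author's implementation.

```python
# Minimal sketch of EMD-based low-pass filtering. Assumes the third-party
# PyEMD package (pip install EMD-signal); the cut-off index k is an assumed
# stand-in for the dissertation's dynamic, horizon-driven mode selection.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
# Synthetic "exchange rate": slow trend + cycle + high-frequency noise.
signal = (1.30 + 0.02 * t
          + 0.05 * np.sin(2 * np.pi * 0.5 * t)
          + 0.01 * rng.standard_normal(t.size))

emd = EMD()
imfs = emd.emd(signal)   # intrinsic mode functions, highest frequency first

k = 3                    # assumed number of noise-dominated IMFs to discard
filtered = imfs[k:].sum(axis=0)   # partial reconstruction = low-pass signal
print(imfs.shape, filtered[:3])
```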
Results show that, for all the exchange rates, the neural network trained on EMD-filtered data is significantly more accurate at predicting exchange rates than the benchmark models of a neural network trained on unfiltered data and a random walk model. The EMD-filtered neural network's predicted returns for the higher sample rates show higher correlations with the actual returns, and significant profits can be made when applying a trading strategy based on the predictions. Lower sample rates that only marginally satisfy the Nyquist criterion perform comparably with the neural network trained on unfiltered data; this may indicate that some aliasing occurs at these sampling rates, as the EMD low-pass filter has a gradual cut-off, leaving some high-frequency noise within the signal.
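To make the trading-strategy evaluation concrete, a minimal directional backtest can be sketched as follows. The rule (go long on a positive predicted return, short otherwise) and the toy arrays are assumptions for illustration, not the strategy or data used in the study.

```python
# Hedged sketch of a sign-based trading rule evaluated on predicted vs.
# actual returns; the arrays are placeholders, not the study's data.
import numpy as np

predicted = np.array([0.004, -0.002, 0.001, -0.003, 0.002])   # model forecasts
actual = np.array([0.003, -0.001, -0.002, -0.004, 0.001])     # realised returns

position = np.sign(predicted)           # +1 = long, -1 = short
strategy_returns = position * actual    # per-period strategy return

corr = np.corrcoef(predicted, actual)[0, 1]
print(f"correlation: {corr:.3f}, cumulative return: {strategy_returns.sum():.4f}")
```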
The proposed model of the neural network trained on EMD-filtered data was able to uncover systematic relationships between the filtered inputs and actual outputs. The model is able to deliver profitable average monthly returns for most of the tested sampling rates and forecast horizons of the different exchange rates. This provides evidence that systematic, predictable behaviour is present within exchange rates, and that this behaviour can be modelled if it is properly separated from high-frequency noise. / MIng (Computer and Electronic Engineering), North-West University, Potchefstroom Campus, 2015
|
94 |
A semi-empirical approach to modelling well deliverability in gas condensate reservoirs
Ugwu, Johnson Obunwa, January 2011
A critical issue in the development of gas condensate reservoirs is accurate prediction of well deliverability. In this investigation a procedure has been developed for accurate prediction of well production rates using a semi-empirical approach. The use of state-of-the-art fine-grid numerical simulation is time-consuming and computationally demanding, and therefore not suitable for the rapid, real-time production management decisions required on site. Development of accurate fit-for-purpose correlations for fluid property prediction below the saturation pressure was a major consideration, to properly allow for retrograde condensation, the complications of multiphase flow, and mobility issues. Previous works are limited to the use of experimentally measured pressure, volume, temperature (PVT) property data, together with static relative permeability correlations, for simulation of well deliverability.
To overcome the above limitations, the fluid property correlations required for prediction of well deliverability, together with a dynamic three-phase relative permeability correlation, have been developed to enable forecasting of these properties at all the desired reservoir conditions. The developed correlations include: condensate hybrid compressibility factor, viscosity, density, compositional pseudo-pressure, and dynamic three-phase relative permeability. The study made use of published databases of experimentally measured gas condensate PVT properties and three-phase relative permeability data. The developed correlations have been implemented in both vertical and horizontal well models, and parametric studies have been performed, using specific case studies, to determine the critical parameters that control productivity in gas condensate reservoirs. On validation, the improved correlations showed superior performance over existing correlations.
The investigation has built on the relevant literature to present an approach that modifies the black oil model for accurate well deliverability prediction in condensate reservoirs at conditions normally ignored by the conventional approach. The original contribution to knowledge and practice includes (i) the improved property correlation equations (4.44, 4.47, 4.66, 4.69, 4.75, 5.21) and (ii) the extension of gas rate equations to condensate rate prediction in both vertical and horizontal wells. Standard industry software, the Eclipse compositional model E-300, has been used to validate the procedure. The results show higher well performance compared with the industry standard. The new procedure is able to model well deliverability with limited PVT and rock property data, which is not possible with most available methods. It also makes possible the evaluation of various enhanced hydrocarbon recovery techniques and the optimisation of gas condensate recovery.
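The pseudo-pressure quantity named among the developed correlations is central to gas-rate equations of this kind. As a generic illustration only, the sketch below numerically evaluates the standard real-gas pseudo-pressure integral m(p) = 2∫ p/(μz) dp; the viscosity and z-factor functions are placeholder assumptions, not the fit-for-purpose correlations developed in the thesis.

```python
# Generic sketch of the standard real-gas pseudo-pressure integral
#   m(p) = 2 * integral from p_base to p of p' / (mu(p') * z(p')) dp'
# mu(p) and z(p) are placeholder trends, not the thesis's correlations.
import numpy as np

def mu(p):
    """Gas viscosity in cp (placeholder: increases mildly with pressure)."""
    return 0.012 + 4.0e-6 * p

def z(p):
    """Gas deviation factor (placeholder trend, dimensionless)."""
    return 1.0 - 5.0e-5 * p + 2.0e-8 * p ** 2

def pseudo_pressure(p, p_base=14.7, n=2000):
    """Trapezoidal evaluation of m(p), with p in psia; result in psia^2/cp."""
    grid = np.linspace(p_base, p, n)
    integrand = 2.0 * grid / (mu(grid) * z(grid))
    return np.trapz(integrand, grid)

print(f"m(5000 psia) ~ {pseudo_pressure(5000.0):.3e} psia^2/cp")
```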
|
95 |
Advances in empirical similitude method
Tadepalli, Srikanth, 02 November 2009
Dimensional Analysis is a technique that has allowed engineering evaluation of complex objects by scaling the analysis results of representative simpler models. The original premise of the procedure stems from the idea of developing non-dimensional parameters that relate physical events to an underlying analytical basis. Extending the process to incorporate non-linear and time-variant behavior has led to the development of a novel process of similitude called the Empirical Similitude Method (ESM), in which experimental data from test specimens are combined to produce the required prediction values.
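To make the notion of non-dimensional parameters concrete, the sketch below works through the traditional similitude route for the airfoil drag example mentioned later in this abstract: drag measured on a scale model is non-dimensionalised as C_d = D / (0.5 ρ V² A) and re-dimensionalised at prototype conditions. All numbers are invented for illustration, not taken from the dissertation.

```python
# Minimal sketch of traditional similitude (TSM): non-dimensionalise drag on
# a scale model, then rescale to the full-size prototype. All numbers are
# illustrative assumptions, not data from the dissertation.

def drag_coefficient(drag, rho, velocity, area):
    """C_d = D / (0.5 * rho * V^2 * A) -- standard non-dimensional drag."""
    return drag / (0.5 * rho * velocity ** 2 * area)

# Wind-tunnel measurement on a 1:10 model in air (SI units). To match the
# prototype Reynolds number in the same fluid, the model runs at 10x speed.
cd_model = drag_coefficient(drag=2.4, rho=1.225, velocity=40.0, area=0.05)

# With dynamic similarity assumed, C_d carries over, so prototype drag
# follows by re-dimensionalising at the full-scale operating point.
rho_p, v_p, area_p = 1.225, 4.0, 5.0
drag_prototype = cd_model * 0.5 * rho_p * v_p ** 2 * area_p
print(f"C_d = {cd_model:.3f}, predicted prototype drag = {drag_prototype:.2f} N")
```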
Using the original motivation and hypothesis of ESM, this research has expanded the experimental similitude process by using adapted matrix representations and continuous functional mapping of test results. This new approach has provided more rigorous mathematical definitions for similarity and prediction estimation, based on an innovative error-minimization algorithm. Shape factors are also introduced and integrated into ESM to obtain a comprehensive evaluation of specimen choices.
A detailed overview is provided summarizing the methods, principles and laws of traditional similitude (TSM) and the systems that satisfy extension into ESM. The applicability of ESM in different systems is described based on the limitations of TSM in the evaluation of complex structures. Several examples and ideas spanning the aerodynamic, thermal, mechanical and electro-magnetic domains are illustrated to complement the inherent technical analysis. For example, the new ESM procedure is shown to be considerably more accurate than earlier methods in predicting the drag coefficient of an airfoil. A final foray into the regime of "design evaluation by similarity" is made to elucidate the applicability and efficiency of the developed techniques in practical systems and products. A thorough methodology is also presented highlighting pertinent procedures and processes in the usage of this method. / text
|
96 |
Empirical Studies of Mobile Apps and Their Dependence on Mobile Platforms
Syer, Mark, 24 January 2013
Our increasing reliance on mobile devices has given rise to a new class of software applications (i.e., mobile apps). Tens of thousands of developers have developed hundreds of thousands of mobile apps that are available across multiple platforms. These apps are used by millions of people around the world every day. However, most software engineering research has been performed on large desktop or server applications.
We believe that research efforts must begin to examine mobile apps. Mobile apps are rapidly growing, yet they differ from traditionally-studied desktop/server applications.
In this thesis, we examine such apps by performing three quantitative studies. First, we study differences in the size of the code bases and development teams of desktop/server applications and mobile apps. We then study differences in the code, dependency and churn properties of mobile apps from two different mobile platforms. Finally, we study the impact of size, coupling, cohesion and code reuse on the quality of mobile apps.
Some of the most notable findings are that mobile apps are much smaller than traditionally-studied desktop/server applications and that most mobile apps tend to be developed by only one or two developers. Mobile app developers tend to rely heavily on functionality provided by the underlying mobile platform through platform-specific APIs. We find that Android app developers tend to rely on the Android platform more than BlackBerry app developers rely on the BlackBerry platform. We also find that defects in Android apps tend to be concentrated in a small number of files and that files that depend on the Android platform tend to have more defects.
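The file-level analysis summarized above lends itself to a brief sketch: flag source files that reference platform APIs, then compare defect counts between platform-dependent and platform-independent files. The file records, the "android." import heuristic, and all counts below are invented for illustration and are not the thesis's data or tooling.

```python
# Hedged sketch of the file-level analysis described above. File names, the
# 'android.' prefix heuristic, and all defect counts are illustrative.
files = [
    {"name": "NetworkHelper.java", "imports": ["android.net.Uri", "java.util.List"], "defects": 7},
    {"name": "Parser.java",        "imports": ["java.util.Map"],                     "defects": 1},
    {"name": "MainActivity.java",  "imports": ["android.app.Activity"],              "defects": 5},
]

def depends_on_platform(f):
    """A file is platform-dependent if it imports any platform API."""
    return any(imp.startswith("android.") for imp in f["imports"])

dependent = [f["defects"] for f in files if depends_on_platform(f)]
independent = [f["defects"] for f in files if not depends_on_platform(f)]

print("mean defects, platform-dependent files:", sum(dependent) / len(dependent))
print("mean defects, platform-independent files:", sum(independent) / len(independent))
```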
Our results indicate that major differences exist between mobile apps and traditionally-studied desktop/server applications. However, the mobile apps of two different mobile platforms also differ. Further, our results suggest that mobile app developers should avoid excessive platform dependencies and focus their testing efforts on source code files that rely heavily on the underlying mobile platform. Given the widespread use of mobile apps and the lack of research surrounding these apps, we believe that our results will have significant impact on software engineering research. / Thesis (Master, Computing) -- Queen's University, 2013-01-24 10:15:56.086
|
97 |
Comparing Five Empirical Biodata Scoring Methods for Personnel Selection
Ramsay, Mark J., 08 1900
A biodata-based personnel selection measure was created to improve the retention rate of Catalog Telemarketing Representatives at a major U.S. retail company. Five separate empirical biodata scoring methods were compared to examine their usefulness in predicting retention and reducing adverse impact: the Mean Standardized Criterion Method, the Option Criterion Correlation Method, the Horizontal Percentage Method, the Vertical Percentage Method, and the Weighted Application Blank Method using England's (1971) Assigned Weights. The study showed that when using generalizable biodata items, all methods except the Weighted Application Blank Method were similar in their ability to discriminate between low- and high-retention employees, and produced similarly low adverse impact effects. The Weighted Application Blank Method did not discriminate between the low- and high-retention employees.
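As a simplified illustration of how an empirical biodata key can be built, the sketch below implements a contrasting-groups weighting in the spirit of the vertical percentage method: each response option is weighted by the difference between its endorsement rates in the high- and low-retention groups. This is a textbook-style simplification over invented data and an assumption about the general form of such methods, not the exact scoring used in this study.

```python
# Simplified sketch of vertical-percentage-style item weighting for a biodata
# key. Responses are (option_chosen, retained) pairs; all data are invented.
from collections import Counter

responses = [("A", True), ("A", True), ("B", True), ("A", False),
             ("B", False), ("B", False), ("C", True), ("C", False)]

high = [opt for opt, retained in responses if retained]      # high-retention group
low = [opt for opt, retained in responses if not retained]   # low-retention group
high_counts, low_counts = Counter(high), Counter(low)

weights = {
    # endorsement rate in the high group minus rate in the low group
    opt: high_counts[opt] / len(high) - low_counts[opt] / len(low)
    for opt in sorted(set(high) | set(low))
}
print(weights)  # positive weight -> option associated with retention
```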
|
98 |
Release management in free and open source software ecosystems
Poo-Caamaño, Germán, 02 December 2016
Releasing software is challenging. To decide when to release software, developers may consider a deadline, a set of features, or quality attributes. Yet there are many stories of software that is not released on time. In large-scale software development, release management requires significant communication and coordination. It is particularly challenging in Free and Open Source Software (FOSS) ecosystems, in which hundreds of loosely connected developers and their projects are coordinated to release software according to a schedule.
In this work, we investigate the release management process in two large-scale FOSS development projects. In particular, our focus is the communication in the whole release management process in each ecosystem across multiple releases. The main research questions addressed in this dissertation are: (1) how do developers in these FOSS ecosystems communicate and coordinate to build and release a common product based on different projects? (2) what are the release management tasks in a FOSS ecosystem? and (3) what are the challenges that release managers face in a FOSS ecosystem?
To understand this process and its challenges better, we used a multiple case study methodology and collected evidence from a combination of the following sources: documents, archival records, interviews, direct observation, participant observation, and physical artifacts. We conducted the case studies on two FOSS ecosystems: GNOME and OpenStack. We analyzed over two and a half years of communication in each ecosystem and studied developers' interactions. GNOME is a collection of libraries, system services, and end-user applications; together, these projects provide a unified desktop: the GNOME desktop. OpenStack is a collection of software tools for building and managing cloud computing platforms for public and private clouds. We catalogued the communication channels, categorized the coordination activities in one channel, and triangulated our results by interviewing key developers identified through social network analysis.
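The social network analysis step mentioned above can be sketched concisely: build a graph whose nodes are developers and whose edges are mailing-list replies, then rank developers by degree centrality to pick interview candidates. The edge list and the use of the networkx library are illustrative assumptions, not the dissertation's tooling.

```python
# Hedged sketch of identifying key developers via social network analysis:
# nodes are developers, edges are mailing-list replies. The edge list and
# the use of networkx are illustrative assumptions.
import networkx as nx

replies = [("alice", "bob"), ("bob", "alice"), ("carol", "alice"),
           ("dave", "alice"), ("carol", "bob"), ("erin", "carol")]

G = nx.Graph()
G.add_edges_from(replies)

centrality = nx.degree_centrality(G)
key_developers = sorted(centrality, key=centrality.get, reverse=True)[:3]
print(key_developers)  # candidate interviewees for triangulation
```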
We found factors that positively impact the release process in a software ecosystem: a release schedule, influence instead of direct control, and diversity. The release schedule drives most of the communication within an ecosystem. To achieve a concerted release, a Release Team helps developers reach technical consensus through influence rather than direct control. The diverse composition of the Release Team might increase its reach and influence in the ecosystem. Our results can help organizations build better large-scale teams, and they show that software engineering research focused on individual projects might miss important parts of the picture.
The contributions of this dissertation are: (1) an empirical study of release management in two FOSS ecosystems, (2) a set of lessons learned from the case studies, and (3) a theory of release management in FOSS ecosystems. We summarize our theory of release management in FOSS ecosystems in three statements: (1) the size and complexity of the integrated product are constrained by the release managers' capacity, (2) release management should be capable of reaching the whole ecosystem, and (3) release managers need social and technical skills. The dissertation discusses this theory in the light of the case studies, other research efforts, and its implications. / Graduate / 0984 / gpoo+proquest@calcifer.org
|
99 |
Three Essays in Financial Economics
Julio, Ivan F., 06 August 2013
No description available.
|
100 |
Analýza pracovní spokojenosti ve společnosti TOMOS, a.s. / Job satisfaction analysis in the company TOMOS, a.s.
Sumová, Lucie, January 2010
The presented diploma thesis is a practical study of job satisfaction in the chosen company, Tomos Praha, a.s. It deals with the theoretical background of job satisfaction, its relation to motivation, and its impact on employee performance. It describes particular theories of job satisfaction and focuses on the factors that influence job satisfaction. In the practical part, it examines job satisfaction in the company Tomos Praha, a.s. from different points of view, analyses the data acquired by the empirical research, and proposes steps for improving the current situation.
|