  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

A descriptive case study of the assimilation of Berean Mission Inc. into UFM International

Talley, John D., January 2001 (has links)
Thesis (D. Min.)--Dallas Theological Seminary, 2001. / Includes abstract. Includes bibliographical references (leaves 120-124).
32

A descriptive case study of the assimilation of Berean Mission Inc. into UFM International

Talley, John D., January 2001 (has links) (PDF)
Thesis (D. Min.)--Dallas Theological Seminary, 2001. / Includes abstract. Includes bibliographical references (leaves 120-124).
33

Development organisation and processes for multi-brand strategies, illustrated by selected examples

Loewens, Sofia. January 2005 (has links) (PDF)
Bachelor's thesis--Univ. St. Gallen, 2005.
34

An analysis of the pricing practices of the Société en Commandite Gaz Métropolitain /

Pouliot, Sébastien. January 2003 (has links)
Thesis (master's)--Université Laval, 2003. / Bibliography: leaves 82-83. Also published in an electronic version.
35

Evolution of collective bargaining processes in two Quebec mills of the Stone-Consolidated paper company between 1990 and 1995 /

Vincent, Claude. January 2002 (has links)
Thesis (master's)--Université Laval, 2002. / Bibliography: leaves [104]-105. Also published in an electronic version.
36

Corporate video, the use of television at Air Products and Chemicals, Inc.: a case study /

McHale, Katherine. January 1986 (has links)
Thesis (M.S.)--Kutztown University. / Source: Masters Abstracts International, Volume: 45-06, page: 2715. Typescript. Includes bibliographical references (leaves 64-66).
37

Improving automated layout techniques for the production of schematic diagrams

Chivers, Daniel January 2014 (has links)
This thesis explores techniques for the automated production of schematic diagrams, in particular those in the style of metro maps. Metro map style schematics are used across the world, typically to depict public transport networks, and therefore benefit from an innate level of user familiarity not found with most other data visualisation styles. Currently, this style of schematic is used infrequently due to the difficulties involved with creating an effective layout: there are no software tools to aid with the positioning of nodes and other features, so schematics are produced by hand at great expense of time and effort. Automated schematic layout has been an active area of research for the past decade, and part of our work builds on an effective current technique, multi-criteria hill climbing. We have implemented additional layout criteria and clustering techniques, as well as performance optimisations, to improve the final results. Additionally, we ran a series of layouts whilst varying algorithm parameters in an attempt to identify patterns specific to map characteristics. This layout algorithm has been implemented in custom-written software running on the Android operating system. The software is targeted at tablet devices, using their touch-sensitive screens with a gesture recognition system to allow users to construct complex schematics using sequences of simple gestures. Following on from this, we present our work on a modified force-directed layout method capable of producing fast, high-quality, angular schematic layouts. Our method produces superior results to the previous octilinear force-directed layout method, and is capable of producing results comparable to those of many much slower current approaches.
Using our force-directed layout method, we then implemented a novel mental map preservation technique which aims to preserve node proximity relations during optimisation; we believe this approach provides a number of benefits over the more common method of preserving absolute node positions. Finally, we performed a user study on our method to test the effect of varying levels of mental map preservation on diagram comprehension.
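The multi-criteria hill climbing approach described in this abstract can be illustrated with a toy sketch. The node names, starting positions, criteria weights, and the two criteria chosen (uniform edge lengths and octilinear edge directions) are all invented here for illustration, not taken from the thesis:

```python
import random

# Toy multi-criteria hill climber for metro-map-style layout.
# Nodes live on an integer grid; the cost function combines two
# common layout criteria: uniform edge lengths and octilinear
# (multiple-of-45-degree) edge directions.

def edge_cost(p, q, target_len=2):
    dx, dy = q[0] - p[0], q[1] - p[1]
    length = (dx * dx + dy * dy) ** 0.5
    octilinear = (dx == 0 or dy == 0 or abs(dx) == abs(dy))
    return abs(length - target_len) + (0 if octilinear else 5)

def total_cost(pos, edges):
    return sum(edge_cost(pos[a], pos[b]) for a, b in edges)

def hill_climb(pos, edges, iterations=2000, seed=0):
    rng = random.Random(seed)
    pos = dict(pos)
    best = total_cost(pos, edges)
    moves = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
             if (dx, dy) != (0, 0)]
    for _ in range(iterations):
        node = rng.choice(list(pos))
        dx, dy = rng.choice(moves)
        old = pos[node]
        pos[node] = (old[0] + dx, old[1] + dy)
        cost = total_cost(pos, edges)
        if cost < best:        # keep only improving moves
            best = cost
        else:                  # revert worsening moves
            pos[node] = old
    return pos, best

# A hypothetical 4-station line with a deliberately messy start layout.
edges = [("A", "B"), ("B", "C"), ("C", "D")]
start = {"A": (0, 0), "B": (1, 3), "C": (5, 4), "D": (6, 8)}
layout, cost = hill_climb(start, edges)
```

Real systems in this family evaluate many more criteria (edge crossings, label placement, relative station order) and, as the abstract notes, add clustering and performance optimisations on top of this basic loop.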
38

Detecting salient information using RSVP and the P3 : computational and EEG explorations

Alsufyani, Abdulmajeed January 2015 (has links)
This thesis investigates the efficacy of employing the Rapid Serial Visual Presentation (RSVP) technique for stimulus presentation in brain activity-based deception detection tests. One reason for using RSVP is to present stimuli on the fringe of awareness (e.g. 10 per second), making it more difficult for a guilty person to confound the test by use of countermeasures. It is hypothesized that such a rapid presentation rate prevents the vast majority of RSVP stimuli from being perceived at a level sufficient for encoding into working memory, but that salient stimuli will break through into consciousness and be encoded. Such ‘breakthrough’ perceptual events are correlated with a P300 Event Related Potential (ERP) component that can be used as an index of perceiving/processing a salient stimulus (e.g. crime-relevant information). On this basis, a method is proposed for detecting salience based on RSVP and the P300, which will be referred to as the Fringe-P3 method. The thesis then demonstrates how the Fringe-P3 method can be specialized for application to deception detection. Specifically, the proposed method was tested in an identity deception study, in which participants were instructed to lie about (i.e. conceal) their own name. As will be shown, experimental findings demonstrated a very high hit rate in detecting deceivers and a low false alarm rate in misdetecting non-deceivers. Most significantly, a review of these findings confirms that the Fringe-P3 identity detector is resilient against countermeasures. The effectiveness of the Fringe-P3 method in detecting stimuli of lower salience than own-name stimuli (i.e. famous names) was then evaluated. In addition, the question of whether faces can be used in an ERP-based RSVP paradigm to infer recognition of familiar faces was also investigated.
The experimental results showed that the method is effective in distinguishing broadly familiar stimuli as salient, resulting in the generation of a detectable P300 component on a per-individual basis. These findings support the applicability of the proposed method to forensic science (e.g. detecting knowledge of criminal colleagues). Finally, an ERP assessment method is proposed for performing per-individual statistical inferences in deception detection tests. By analogy with functional localizers in fMRI, this method can be viewed as a form of functional profiling. The method was evaluated on EEG data sets obtained by use of the Fringe-P3 technique. Additionally, simulated data were used to explore how the method’s performance varies with parametric manipulation of the signal-to-noise ratio (SNR). As will be demonstrated, experimental findings confirm that the proposed method is effective for detecting the P300, even in ERPs with low SNR.
39

Application of dynamic factor modelling to financial contagion

Sakaria, Dhirendra Kumar January 2016 (has links)
Contagion has been described as the spread of idiosyncratic shocks from one market to another in times of financial turmoil. In this work, contagion is modelled using a global factor to capture general market movements, while idiosyncratic shocks are used to capture co-movements and volatility spillover between markets. Many previous studies have used pre-specified turmoil and calm periods to understand when contagion occurs. We introduce time-varying parameters which model the volatility spillover from one country to another; this approach avoids the need to pre-specify particular types of periods using external information. Efficient Bayesian inference can be made using the Kalman filter in a forward filtering and backward sampling algorithm. The model is applied to market indices for Greece and Spain to understand the effect of contagion during the European sovereign debt crisis of 2007-2013 (Euro crisis) and to examine the volatility spillover between Greece and Spain. Similarly, the volatility spillover from Hong Kong to Singapore during the Asian financial crisis of 1997-1998 has also been studied. After a review of research in the financial contagion area and of the definitions used, we specify a model based on the work of Dungey et al. (2005) and include a world factor. Time-varying parameters are introduced, and Bayesian inference and MCMC simulations are used to estimate the parameters. This is followed by work using the normal mixture model based on the paper by Kim et al. (1998), where we realised that the volatility parameter estimates depended on the value of the ‘mixture offset’ parameter. We propose a method to overcome the problem of setting this parameter value. In the final chapter, a stochastic volatility model with heavy tails for the innovations in the volatility spillover is used, and results from simulated cases and from the market data for the Asian financial crisis and Euro crisis are summarised.
Briefly, the Asian financial crisis periods are identified clearly and agree with results in other published work. For the Euro crisis, the periods of volatility spillover (or financial contagion) are also identified, but over shorter periods of time. We conclude with a summary and an outline of further work.
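The forward filtering and backward sampling (FFBS) step mentioned in the abstract can be sketched for the simplest possible case. The thesis works with a multivariate factor model; the univariate local-level model, the variance values, and the simulated data below are invented for illustration only:

```python
import random

# Minimal FFBS sketch for a univariate local-level model:
#   state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, Q)
#   observation: y_t = x_t + v_t,      v_t ~ N(0, R)

def kalman_filter(ys, Q, R, m0=0.0, P0=10.0):
    """Forward pass: filtered means and variances for each time step."""
    ms, Ps = [], []
    m, P = m0, P0
    for y in ys:
        m_pred, P_pred = m, P + Q          # predict
        K = P_pred / (P_pred + R)          # Kalman gain
        m = m_pred + K * (y - m_pred)      # update mean
        P = (1 - K) * P_pred               # update variance
        ms.append(m)
        Ps.append(P)
    return ms, Ps

def backward_sample(ms, Ps, Q, rng):
    """Backward pass: one joint draw of the state path given the data."""
    T = len(ms)
    xs = [0.0] * T
    xs[-1] = rng.gauss(ms[-1], Ps[-1] ** 0.5)
    for t in range(T - 2, -1, -1):
        h = Ps[t] / (Ps[t] + Q)            # smoothing gain
        mean = ms[t] + h * (xs[t + 1] - ms[t])
        var = Ps[t] * Q / (Ps[t] + Q)
        xs[t] = rng.gauss(mean, var ** 0.5)
    return xs

# Simulate a random-walk state observed with noise, then filter and sample.
rng = random.Random(1)
true_x, ys = 0.0, []
for _ in range(50):
    true_x += rng.gauss(0, 0.1)
    ys.append(true_x + rng.gauss(0, 0.5))

ms, Ps = kalman_filter(ys, Q=0.01, R=0.25)
draw = backward_sample(ms, Ps, Q=0.01, rng=rng)
```

Inside an MCMC scheme, draws like `draw` would be alternated with updates of the variance parameters; the factor-model version replaces the scalar recursions with their matrix analogues.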
40

Provenance-aware CXXR

Silles, Christopher Anthony January 2014 (has links)
A provenance-aware computer system is one that records information about the operations it performs on data, enabling it to provide an account of the process that led to a particular item of data. These systems allow users to ask questions of data, such as “What was the sequence of steps involved in its creation?”, “What other items of data were used to create it?”, or “What items of data used it during their creation?”. This work presents a study of how, and the extent to which, the CXXR statistical programming software can be made aware of the provenance of the data on which it operates. CXXR is a variant of the R programming language and environment, which is an open-source implementation of S. Interestingly, S was an early pioneer of provenance-aware computing, in 1988. Examples of adapting software such as CXXR for provenance-awareness are few and far between, and the idiosyncrasies of an interpreter such as CXXR, and of the R language itself, present interesting challenges to provenance-awareness, such as receiving input from a variety of sources and complex evaluation mechanisms. Herein presented are designs for capturing and querying provenance information in such an environment, along with serialisation facilities to preserve data together with its provenance so that they may be distributed and/or subsequently restored to a CXXR session. Also presented is a method for enabling this serialised provenance information to interoperate with other provenance-aware software. This work also looks at the movement towards making research reproducible, and considers that provenance-aware systems, and provenance-aware CXXR in particular, are well positioned to further the goal of making computational research reproducible.
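The kind of capture-and-query facility the abstract describes can be sketched with a toy provenance recorder. The class and method names below are invented for illustration and bear no relation to CXXR's actual internals: each derived value records which inputs produced it and by what operation, so ancestry queries can be answered later.

```python
# Toy provenance recorder in the spirit of a provenance-aware interpreter.

class Provenance:
    def __init__(self):
        self.parents = {}   # name -> (operation, tuple of parent names)
        self.values = {}

    def literal(self, name, value):
        """Record a value entered directly, with no data parents."""
        self.values[name] = value
        self.parents[name] = ("literal", ())

    def derive(self, name, operation, func, *parent_names):
        """Compute a value and record the operation and inputs used."""
        self.values[name] = func(*(self.values[p] for p in parent_names))
        self.parents[name] = (operation, parent_names)

    def ancestors(self, name):
        """All items used, directly or indirectly, to create `name`."""
        seen = set()
        stack = list(self.parents[name][1])
        while stack:
            p = stack.pop()
            if p not in seen:
                seen.add(p)
                stack.extend(self.parents[p][1])
        return seen

prov = Provenance()
prov.literal("a", 2)
prov.literal("b", 3)
prov.derive("c", "add", lambda x, y: x + y, "a", "b")
prov.derive("d", "square", lambda x: x * x, "c")
```

Here `prov.ancestors("d")` answers the abstract's second question, "What other items of data were used to create it?"; a real interpreter would hook this recording into evaluation itself and serialise the graph alongside the data.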
