  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

The development of an improved coded-pulse, vertical-incidence ionosonde

Cretchley, Brian Brind January 1979 (has links)
This thesis describes the theoretical development of a new ionospheric sounding system. The different types of ionosonde, their prime objectives, and their relative merits and demerits are discussed. The various types of code and their correlation functions are described. The essential requirements of the new system are listed, and suitable codes are found for it. Computer calculations and mathematical derivations demonstrate the theoretical suitability of these codes under all conditions. Essentials of the mode of operation of the system and details of its design are specified, and computer simulations are used to examine relevant aspects of its operation. Finally, since construction of the system is not yet complete and results cannot therefore be presented, its current state of construction is described.
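As a rough illustration of the kind of pulse-compression code and correlation function the abstract refers to (the thesis does not specify which codes were chosen), the sketch below computes the aperiodic autocorrelation of the length-13 Barker code; its sharp central peak and unit-magnitude sidelobes are the properties a coded-pulse sounder's matched filter exploits.

```python
# Illustrative sketch only (not taken from the thesis): aperiodic
# autocorrelation of the length-13 Barker code, a classic pulse-compression
# code with peak 13 and sidelobe magnitudes of at most 1.

BARKER_13 = [+1, +1, +1, +1, +1, -1, -1, +1, +1, -1, +1, -1, +1]

def autocorrelation(code):
    """Aperiodic autocorrelation r[k] = sum_i code[i] * code[i + k], k >= 0."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k)) for k in range(n)]

print(autocorrelation(BARKER_13))  # [13, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
```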
242

Computer modelling of the thermal decomposition of solids

De la Croix, Annemarie January 1996 (has links)
Decompositions of solids are typically of the form A(s) → B(s) + gases. Symmetry-controlled routes (based on known and hypothetical crystal structures) for transforming the solid reactant into the solid product were devised as possible decomposition pathways. Lattice energies of the reactants, of the postulated transient intermediate structures and of the final solid products were then estimated by crystal modelling procedures. Profiles of lattice energy changes during the proposed decomposition routes were constructed, and any energy barriers were compared with experimental activation energies reported for the thermal decompositions. The crystal modelling was performed with the computer program WMIN. Calculation of the lattice energies involved the development of a model potential for the perfect lattice and the evaluation of the interatomic parameters. The potential was based on the Born model of ionic solids, using the Buckingham potential (φ(r) = A·exp(-r/ρ) - C/r⁶) to describe the short-range energy contribution. Empirical fitting was used to establish reliable interatomic energy parameters. The reliability of the interatomic potentials was assessed by calculating crystal structures and lattice energies (which were not included in the fitting). The particular reactions selected for modelling were the decompositions of the alkaline-earth metal (Ca, Sr, Ba) peroxides and carbonates: MO₂(s) → MO(s) + ½O₂(g) and MCO₃(s) → MO(s) + CO₂(g). The lattice energies calculated for the known structures were in good agreement with reported values (except for BaO₂ and BaCO₃), which provided support for the adequacy of the potential model used. Activation energies calculated for the decomposition of the carbonates were in the correct order but higher than experimental values, i.e., 422, 422, 465 and 499 kJ mol⁻¹ compared to the experimental values of 205, 87(?), 222 and 283 kJ mol⁻¹ for CaCO₃ (calcite), CaCO₃ (aragonite), SrCO₃ and BaCO₃. The values calculated for the peroxides (91 and 100 kJ mol⁻¹ compared to the experimental values of 119 and 185 kJ mol⁻¹ for SrO₂ and BaO₂ respectively) were less satisfactory but could be a reflection of the poor structural data used for the peroxides. The significance of this approach to the modelling of solid decompositions is discussed.
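A minimal sketch of how a Born-model energy estimate is assembled from Coulomb and Buckingham pair terms, in the spirit of the abstract. This is not the WMIN calculation: it sums over a finite toy cluster rather than using Ewald summation over the periodic crystal, and all parameter values and ion labels are placeholders rather than the fitted potentials of the thesis.

```python
# Minimal sketch: Coulomb + Buckingham pair energies summed over a toy cluster.
#   phi(r) = A*exp(-r/rho) - C/r**6   (short-range Buckingham term)
# All numeric parameters and the example ions below are placeholders.
import itertools
import math

KE = 14.399645  # e^2/(4*pi*eps0) in eV*Angstrom per (elementary charge)^2

def buckingham(r, A, rho, C):
    """Short-range pair energy phi(r) = A*exp(-r/rho) - C/r^6."""
    return A * math.exp(-r / rho) - C / r**6

def pair_energy(r, qi, qj, A, rho, C):
    """Coulomb + Buckingham energy (eV) for one ion pair at separation r (Angstrom)."""
    return KE * qi * qj / r + buckingham(r, A, rho, C)

def cluster_energy(ions, params):
    """Sum pair energies over all distinct pairs in a finite cluster.

    ions: list of (label, charge, (x, y, z)); params: dict keyed by frozenset of labels.
    A real lattice-energy calculation would instead use Ewald summation over the
    periodic crystal, as programs such as WMIN do.
    """
    total = 0.0
    for (li, qi, pi), (lj, qj, pj) in itertools.combinations(ions, 2):
        r = math.dist(pi, pj)
        A, rho, C = params[frozenset((li, lj))]
        total += pair_energy(r, qi, qj, A, rho, C)
    return total

# Example: a single cation-anion pair with placeholder parameters.
ions = [("M", +2.0, (0.0, 0.0, 0.0)), ("O", -2.0, (2.1, 0.0, 0.0))]
params = {frozenset(("M", "O")): (1300.0, 0.30, 0.0)}
print(cluster_energy(ions, params))
```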
243

Parallel implementation of a virtual reality system on a transputer architecture

Bangay, Shaun Douglas 11 October 2012 (has links)
A Virtual Reality is a computer model of an environment, actual or imagined, presented to a user in as realistic a fashion as possible. Stereo goggles may be used to provide the user with a view of the modelled environment from within the environment, while a data-glove is used to interact with the environment. To simulate reality on a computer, the machine has to produce realistic images rapidly. Such a requirement usually necessitates expensive equipment. This thesis presents an implementation of a virtual reality system on a transputer architecture. The system is general, and is intended to provide support for the development of various virtual environments. The three main components of the system are the output device drivers, the input device drivers, and the virtual world kernel. This last component is responsible for the simulation of the virtual world. The rendering system is described in detail. Various methods for implementing the components of the graphics pipeline are discussed. These are then generalised to make use of the facilities provided by the transputer processor for parallel processing. A number of different decomposition techniques are implemented and compared. The emphasis in this section is on the speed at which the world can be rendered, and the interaction latency involved. In the best case, where almost linear speedup is obtained, a world containing over 250 polygons is rendered at 32 frames/second. The bandwidth of the transputer links is the major factor limiting speedup. A description is given of an input device driver which makes use of a powerglove. Techniques for overcoming the limitations of this device, and for interacting with the virtual world, are discussed. The virtual world kernel is designed to make extensive use of the parallel processing facilities provided by transputers. It is capable of providing support for multiple worlds concurrently, and for multiple users interacting with these worlds. Two applications are described that were successfully implemented using this system. The design of the system is compared with other recently developed virtual reality systems. Features that are common or advantageous in each of the systems are discussed. The system described in this thesis compares favourably, particularly in its use of parallel processors.
244

An evaluation of the validity and reliability of chemical exposure assessment tools used under the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) Regulation 2006

Lamb, Judith January 2016 (has links)
Within the European Union, the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) Regulation 2006 requires the registration of chemicals by manufacturers and suppliers by a number of deadlines from 2008 to 2018. Registrants must prepare a dossier of substance information, incorporating a chemical safety report and exposure assessment for the work activities for which the substance is used. Several computer-based screening (Tier 1) tools mentioned in the REACH guidance are used by Registrants to assess exposure: the ECETOC Targeted Risk Assessment (TRA); STOFFENMANAGER®; the EMKG-EXPO-TOOL and MEASE. As screening tools, they are expected to generate overestimates of exposure, thus allowing identification of situations of concern requiring more detailed assessment. The study aimed to carry out a comprehensive evaluation of the tools' validity, by comparison of their predictions with workplace measurements of inhalation exposure, and of their reliability, through assessment of consistency of operation by different users. Comparison of tool predictions of exposure with measurement data (n=3941) from a wide range of European and North American sources suggested that the tools generated overestimates in the majority of cases, with exceptions noted in certain process types and substance forms. Extremely large variability was observed in tool outputs generated by a group of representative users (n=146) when assessing identical exposure situations. A small number of input parameters drove most of the variation: the task descriptor, risk management measures and type of setting. Further calibration and review of the tool predictions are required to ensure that they generate consistent overestimates of exposure. Systems should be implemented to reduce between-user variability, for example user certification, round-robin exercises and team-based assessments. The study findings will assist in increasing the consistency of use and validity of the tools, thus increasing the overall levels of worker health protection and minimising unnecessary business risk management expenditure on redundant exposure controls.
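A minimal sketch of the kind of conservatism check described above: for paired tool predictions and workplace measurements, count the fraction of cases in which the prediction overestimates the measured exposure. The function and the example values are hypothetical, not the study's actual analysis.

```python
# Illustrative sketch only: fraction of paired cases where a Tier 1 screening
# tool's predicted exposure exceeds the measured value (a conservative tool
# should overestimate in most cases). Data below are hypothetical.

def conservatism(pairs):
    """pairs: iterable of (predicted, measured) exposures in the same units.

    Returns the fraction of cases where the prediction is an overestimate.
    """
    pairs = list(pairs)
    over = sum(1 for pred, meas in pairs if pred > meas)
    return over / len(pairs)

# Example: three hypothetical scenarios (mg/m^3).
print(conservatism([(2.0, 0.5), (1.0, 1.4), (5.0, 0.9)]))  # -> 0.666...
```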
245

An electronic simulation of the human handwriting system

McDonald, Robert Glen January 1970 (has links)
A simple model of an antagonistic muscle system is developed based on several published physiological observations and is found to be a linear first order approximation to mammalian muscle. The model response is compared to that of the human hand in various tests including impulse response, frequency response, step response and the effect of sliding friction. The results of these tests are used to select the mechanical parameters of the proposed model. The final model simulates many of the observed responses of the human hand when executing motions similar to handwriting. A control scheme is proposed for use with the mechanical model and an electronic simulation of the whole system is conducted using a digital and an analogue computer. Good matches of displacement and acceleration waveforms from human handwriting were produced by the simulation. A discussion of some physiological evidence supporting the proposed control scheme is given. / Applied Science, Faculty of / Electrical and Computer Engineering, Department of / Graduate
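A minimal sketch, assuming the antagonistic muscle pair is reduced (as the abstract suggests) to a linear first-order system, τ·dx/dt + x = K·u(t). The gain and time constant below are placeholders, not the mechanical parameters selected in the thesis.

```python
# Minimal sketch: unit-step response of a linear first-order system,
#   tau*dx/dt + x = K*u(t)  ->  x(t) = K*(1 - exp(-t/tau)) for a step at t = 0.
# K and tau are placeholder values, not those fitted in the thesis.
import math

def step_response(K, tau, t):
    """Displacement of a first-order system to a unit step input at t = 0."""
    return K * (1.0 - math.exp(-t / tau))

# Example: sample the response over one second with placeholder parameters.
K, tau = 1.0, 0.1  # gain and time constant (arbitrary units)
for i in range(0, 101, 20):
    t = i / 100
    print(f"t = {t:.2f} s, x = {step_response(K, tau, t):.3f}")
```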
246

Computer simulation of microvascular exchange after thermal injury

Gu, Xiaozheng January 1987 (has links)
A computer model is developed to study the fluid and protein redistribution after thermal injuries in rats. This model is derived by including the burned skin as a fourth compartment in the microvascular exchange model developed by Bert et al. [6]. The pathological changes that occur after thermal injuries are introduced into the burn model as perturbations. Simulations of short-term and long-term responses were then made with this four-compartment (burn) model for two cases: 10% and 40% surface area burns. Appropriate ranges of the perturbations were estimated based on the available information in the literature. The perturbations for the 10% burn include: the plasma leak coefficient in the injured skin, the tissue pressure in the injured skin, the fluid exchange coefficients in the injured skin, the arterial capillary pressure in the injured skin and the lymph flow characteristics in the injured skin. The perturbations for the 40% burn include the perturbations for the 10% burn plus the plasma leak coefficients in the intact tissues, the fluid exchange coefficients in the intact tissues and the lymph flow characteristics in the intact tissues. The dynamic responses of the system using these perturbations were plotted. Agreement between the simulation predictions and the experimental data was characterized in terms of the sum of squared differences. Compared to the limited amount of data available in the literature, the burn model describes microvascular exchange after thermal injuries reasonably well. The work in this thesis could easily be extended to account for fluid resuscitation following a thermal injury in rats, and it is hoped that this approach might eventually be applied to the resuscitation management of burn patients. / Applied Science, Faculty of / Chemical and Biological Engineering, Department of / Graduate
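A toy sketch of compartmental fluid exchange of the general kind described above (this is not the Bert et al. four-compartment model): fluid moves between a plasma and a single tissue compartment according to a Starling-type flux, and the volumes are advanced with a simple Euler step. All coefficients and pressure relations are placeholders; a burn would enter as perturbations to the exchange coefficient, reflection coefficient and tissue pressure, as the abstract describes.

```python
# Toy two-compartment sketch (placeholder values, not the thesis model):
# Starling-type flux J = Kf*((Pc - Pt) - sigma*(pi_c - pi_t)) between plasma
# and tissue, integrated with an explicit Euler step.

def starling_flux(Kf, Pc, Pt, sigma, pic, pit):
    """Fluid filtration rate from plasma to tissue (volume per unit time)."""
    return Kf * ((Pc - Pt) - sigma * (pic - pit))

def simulate(Vp, Vt, hours, dt=0.01, Kf=1.0, sigma=0.9):
    """Euler integration of plasma (Vp) and tissue (Vt) volumes over time."""
    for _ in range(round(hours / dt)):
        # Hypothetical linear pressure-volume relations, for illustration only.
        Pc, Pt = 2.0 * Vp, 0.1 * Vt
        pic, pit = 25.0, 10.0  # fixed colloid osmotic pressures (placeholder)
        J = starling_flux(Kf, Pc, Pt, sigma, pic, pit)
        Vp -= J * dt
        Vt += J * dt
    return Vp, Vt

print(simulate(Vp=10.0, Vt=20.0, hours=5.0))  # volumes approach a steady state
```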
247

Modelling default-risky bonds

Magwegwe, Frank Mashoko 22 July 2005 (has links)
In this dissertation, we examine current models used to value default-risky bonds. These models include both the structural and the reduced-form approaches. We begin by examining various issues involved in modelling credit risk and pricing credit derivatives. We then explore the various dimensions of structural models and reduced-form models and we provide an overview of four models presented in the literature on credit risk modelling. Both the theoretical and empirical research on default-risky bond valuation is summarized. Finally, we make suggestions for improving on the credit risk models discussed. / Dissertation (MSc (Mathematics of Finance))--University of Pretoria, 2006. / Mathematics and Applied Mathematics / unrestricted
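As a hedged example of the structural class of models the dissertation surveys (the specific models it reviews are not named in the abstract), the sketch below values a defaultable zero-coupon bond under Merton's model, where the debt value is D = V·N(-d1) + F·e^(-rT)·N(d2) and the implied credit spread follows from D. The numeric inputs are illustrative only.

```python
# Hedged sketch of one structural model (Merton): firm value V follows
# geometric Brownian motion, debt is a zero-coupon bond with face F at T.
# All numeric inputs below are illustrative placeholders.
from math import exp, log, sqrt
from statistics import NormalDist

def merton_bond_value(V, F, r, sigma, T):
    """Risky zero-coupon bond value D = V*N(-d1) + F*exp(-r*T)*N(d2)."""
    N = NormalDist().cdf
    d1 = (log(V / F) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * N(-d1) + F * exp(-r * T) * N(d2)

def credit_spread(V, F, r, sigma, T):
    """Continuously compounded spread over the risk-free rate implied by D."""
    D = merton_bond_value(V, F, r, sigma, T)
    return -log(D / F) / T - r

# Example: firm value 120, face value 100, 5% risk-free rate, 25% volatility, 2 years.
print(merton_bond_value(120, 100, 0.05, 0.25, 2.0))
print(credit_spread(120, 100, 0.05, 0.25, 2.0))
```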
248

An integrated complexity analysis of problems from computational biology

Hallett, Michael Trevor 17 July 2018 (has links)
We perform an integrated complexity analysis on a number of combinatorial problems arising from the field of computational biology. The classic framework of NP-completeness, algorithmic design techniques for bounded width graphs, and parameterized computational complexity together provide a clear and detailed map of the intrinsic hardness of the following problems: INTERVALIZING COLORED GRAPHS and SHORTEST COMMON SUPERSEQUENCE. The fundamental concern of parameterized complexity is the apparent qualitative difference in algorithmic behaviour displayed by many problems when one or more input parameters are bounded. For many problems, only a small range of values for these parameters captures most instances arising in practice. This is certainly the case in computational biology in several specific arenas such as DNA physical mapping or multiple sequence alignment. At its most general level, parameterized complexity partitions problems into two classes: fixed parameter tractable (FPT) and fixed parameter intractable (hard for classes of the W-hierarchy). The former indicates that the particular parameterization may allow for efficient practical algorithms, whilst the latter indicates the parameterization is not effective (asymptotically) in alleviating the intractability. The problem INTERVALIZING COLORED GRAPHS (ICG) models in a straightforward albeit limited way the determination of contig assemblies in the mapping of DNA. We show ICG to be NP-complete (no polynomial time algorithm unless P=NP), not finite-state (a very general algorithmic design technique for bounded width graphs fails), and hard for the parameterized complexity class W[1] (a specific parameterized version of ICG does not admit an efficient algorithm unless many other well-known - and apparently hard - problems admit efficient algorithms). Both SHORTEST COMMON SUPERSEQUENCE and its sister problem LONGEST COMMON SUBSEQUENCE have applications in multiple sequence alignment. We show that SHORTEST COMMON SUPERSEQUENCE PARAMETERIZED BY THE NUMBER OF INPUT STRINGS AND THE SIZE OF THE ALPHABET is hard for complexity class W[1]. As is the case with ICG, this implies that it does not admit efficient algorithms unless some unlikely computational complexity collapses occur. / Graduate
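For orientation (not part of the thesis), the two-string case of SHORTEST COMMON SUPERSEQUENCE is solvable by standard dynamic programming; the hardness result above concerns the general problem with many input strings, parameterized by the number of strings and the alphabet size.

```python
# Illustrative sketch: length of the shortest common supersequence of two
# strings by dynamic programming (equivalently |a| + |b| - LCS(a, b)).
# The W[1]-hardness shown in the thesis applies to the many-string case.

def scs_length(a: str, b: str) -> int:
    """Length of the shortest string containing both a and b as subsequences."""
    m, n = len(a), len(b)
    # dp[i][j] = SCS length of the suffixes a[i:] and b[j:]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m, -1, -1):
        for j in range(n, -1, -1):
            if i == m:
                dp[i][j] = n - j           # only b's suffix remains
            elif j == n:
                dp[i][j] = m - i           # only a's suffix remains
            elif a[i] == b[j]:
                dp[i][j] = 1 + dp[i + 1][j + 1]
            else:
                dp[i][j] = 1 + min(dp[i + 1][j], dp[i][j + 1])
    return dp[0][0]

print(scs_length("AGGTAB", "GXTXAYB"))  # 9, e.g. "AGXGTXAYB"
```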
249

Inferring disease transmission networks

Yang, Xiaofei 19 March 2014 (has links)
To investigate how an infectious disease spreads, it is desirable to use the observed surveillance data to discover the underlying (often hidden) disease transmission networks. Previous studies have provided methods for inferring information diffusion networks in which each node corresponds to an individual person within the diffusion network. However, in the case of disease transmission, to effectively propose and implement intervention strategies, it is more realistic and reasonable for policy makers to study the diffusion patterns at a metapopulation level, that is, to consider disease transmission networks in which nodes represent subpopulations, and links indicate their interrelationships. Such networks can be useful in several ways: (i) to investigate hidden impact factors that influence epidemic dynamics, (ii) to reveal possible sources of epidemic outbreaks, and (iii) to practically develop and/or improve strategies for controlling the spread of infectious diseases. Therefore, this thesis addresses the problem of inferring disease transmission networks at a metapopulation level. A network inference method called NetEpi (Network Epidemic) is developed and evaluated using both synthetic and real-world datasets. The experimental results show that NetEpi can recover most of the ground-truth disease transmission networks using only surveillance data.
250

Simulation of multiple closed loop milling circuits

Rabins, Richard 02 July 2015 (has links)
No description available.
