  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
611

Physical and computational applications of strongly-interacting dynamics beyond QCD

Bennett, Edward January 2013 (has links)
In this thesis we numerically investigate SU(2) theories with Dirac or Majorana fermions in the adjoint representation. Majorana fermions have historically proven difficult to treat numerically; here, a change of basis is introduced that allows two Majorana fermions to be expressed in terms of one Dirac fermion. This also provides greater insight into the analysis of the properties of theories with Dirac fermions. Attention is focused on the SU(2) theory with a single Dirac flavour (equivalently, two Majorana flavours). Its lattice phase diagram, spectrum, and the anomalous dimension of the chiral condensate are investigated. We observe a long region of constant mass ratios and an anomalous dimension 0.9 ≲ γ∗ ≲ 0.95. The behaviour of the pion mass, and in particular the presence of a light scalar, points to behaviour that is not traditionally confining; instead the theory appears to lie in or near the conformal window. The topological susceptibility and instanton size distribution are also investigated, for the one-Dirac-flavour theory and additionally for the pure-gauge and two-Dirac-flavour (Minimal Walking Technicolor) theories. These properties are found not to depend on the number of flavours, indicating a quenching of the fermions in the topology, which is also consistent with (near-)conformal behaviour (as has previously been reported in studies of other observables for Minimal Walking Technicolor). The code used is described, and a high-performance computing benchmark developed from it is detailed. The benchmark was originally developed to investigate the performance of different supercomputer architectures for the class of problems we are interested in; due to the nature of the code on which it is based, however, it has an unusual flexibility in the demands it may place on a machine's performance characteristics, which may allow it to be applied to problems outside lattice physics. The benchmark is used to characterise the relative performance of a number of machines.
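The topological susceptibility mentioned above is conventionally estimated on the lattice as χ = ⟨Q²⟩/V, the ensemble average of the squared topological charge per unit lattice volume. A minimal sketch of that estimator; the ensemble here is randomly generated purely for illustration and is not data from the thesis:

```python
import numpy as np

def topological_susceptibility(charges, volume):
    """Estimate chi = <Q^2> / V from topological charges Q measured
    on an ensemble of lattice configurations."""
    charges = np.asarray(charges, dtype=float)
    return np.mean(charges ** 2) / volume

# Hypothetical ensemble: integer topological charges on a 16^3 x 32 lattice
rng = np.random.default_rng(0)
Q = rng.integers(-3, 4, size=200)   # values in -3..3
V = 16 ** 3 * 32
chi_t = topological_susceptibility(Q, V)
print(chi_t)
```

In practice the charges come from the gauge configurations themselves, and the error analysis must account for autocorrelations; this sketch only shows the estimator.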
612

Consumer response to computerised nutritional information at the point-of-purchase in catering establishments

Balfour, Donna S. January 1994 (has links)
Increased scientific understanding of the links between nutrition and health has led to a demand for more nutrition information to be made available to consumers. Nutrition information is widely available on supermarket products but is rarely found in catering establishments. This research involved the provision of nutrition information in canteens and restaurants and studied its effect on consumer meal choices. A study was designed to find the optimum visual method of displaying nutrition information. Eight nutritional formats were systematically tested on customers in a shopping centre food court. Graphical formats displaying nutrition information in relation to current dietary advice relayed the information significantly more quickly than, and as accurately as, tabular displays. A database system was developed to provide nutrition information on the menu items making up a selected meal. A program suite was designed to enable the creation of recipes and menus. The nutritional breakdown of a selected meal was displayed to the customer, who was then given the opportunity to change their meal before it was acquired. All initial choices and subsequent changes were recorded for analysis. Surveys carried out in two canteen locations (n=694) revealed that a significant percentage of customers (16%) did make changes to their meal after viewing the nutritional information on their first choice. Those who did not change were, on average, already making "healthy" meal choices. Those who did change made second choices that were, on average, significantly lower in energy, saturated fatty acids and non-milk extrinsic sugars than their first selections. Overall, "healthier" choices were made with the second selection, which did not differ significantly in nutritional content from the meals chosen by those respondents who had not wished to change.
Further research is necessary to determine whether the intention to change a selected meal as demonstrated by this research would be carried through by the respondents to the actual food selection.
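The meal-analysis step described above (sum the nutrients of the chosen menu items, then offer the customer a chance to revise the meal) can be sketched as follows. The food database and its nutrient values are invented for illustration and are not taken from the study:

```python
# Illustrative nutrient database (per portion); all values are invented.
FOODS = {
    "fried fish": {"energy_kcal": 450, "sat_fat_g": 8.0, "sugars_g": 0.5},
    "chips":      {"energy_kcal": 380, "sat_fat_g": 4.5, "sugars_g": 0.7},
    "side salad": {"energy_kcal": 60,  "sat_fat_g": 0.3, "sugars_g": 2.0},
}

def meal_totals(items):
    """Sum each nutrient over the menu items making up a meal."""
    totals = {"energy_kcal": 0.0, "sat_fat_g": 0.0, "sugars_g": 0.0}
    for item in items:
        for nutrient, amount in FOODS[item].items():
            totals[nutrient] += amount
    return totals

# A first choice, and a revised choice after seeing the breakdown
first_choice = meal_totals(["fried fish", "chips"])
second_choice = meal_totals(["fried fish", "side salad"])
print(first_choice)
print(second_choice)
```

The thesis system additionally logged every initial choice and revision for later statistical comparison, which is omitted here.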
613

Penetration Testing of Web Applications in a Bug Bounty Program

Schulz, Pascal January 2014 (has links)
Web applications provide the basis for use of the World Wide Web as people know it nowadays. These software solutions are programmed by a vast number of developers all over the world, and for all this software it is not possible to guarantee 100 percent security. It is therefore desirable that every application be evaluated using penetration tests. A new form of security testing platform is provided by bug bounty programs, which encourage the community to help search for security breaches. This work introduces the currently leading portal for bug bounties, Bugcrowd Inc. In addition, web applications that were part of the program were tested in order to evaluate their security level. A comparison is made with statistics published by leading penetration testing companies, showing the average web application security level. The submission process for reporting vulnerabilities is evaluated, and the average time taken to receive an answer regarding a submission is reviewed. Finally, the findings are retested to evaluate whether the bug bounty program is a useful means of increasing security and whether website operators take submissions seriously by patching the reported software flaws.
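The retesting step can be modelled in miniature: a reported reflected-XSS finding counts as fixed once the probe string no longer appears unescaped in the server's response. A simplified sketch; the probe and the simulated responses are illustrative, not the author's actual test data or tooling:

```python
import html

XSS_PROBE = '<script>alert(1)</script>'

def still_vulnerable(response_body: str) -> bool:
    """A finding is considered fixed when the probe is reflected only
    in HTML-escaped form (or not at all) in the response body."""
    return XSS_PROBE in response_body

# Simulated responses before and after the operator patched the flaw
before = f"<p>You searched for: {XSS_PROBE}</p>"
after = f"<p>You searched for: {html.escape(XSS_PROBE)}</p>"
print(still_vulnerable(before))
print(still_vulnerable(after))
```

A real retest would fetch the live page over HTTP and would also need to consider context-dependent escaping (attributes, JavaScript strings), which this string check ignores.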
614

Laser surface alloying of copper with Ni-based hardfacing alloys for enhancing hardness and corrosion resistance

Kam, Weng Seng January 2017 (has links)
University of Macau / Faculty of Science and Technology / Department of Electromechanical Engineering
615

An embedded object approach to embedded system development

Vallius, T. (Tero) 27 October 2009 (has links)
Abstract Building an embedded system from an idea to a product is a slow and expensive process requiring a lot of expertise. Depending on the developer's expertise, the required quantity and price level of the final product, and the time and money available for development, the developer can build a device from components of different granularity, ranging from ready-made platforms, kits, and modules to individual components. Generally, solutions requiring less expertise, time and money produce products with higher production costs. The main contribution of this thesis is the EOC (Embedded Object Concept) and the Atomi II Framework. The EOC applies common object-oriented methods from software to small electronic modules that form complete functional entities. The conceptual idea of the embedded objects is implemented with the Atomi II Framework, which contains several techniques for making the EOC commercially feasible. The EOC and the Atomi II Framework decrease the difficulty of building embedded systems by enabling the use of ready-made modules. They enable automatic conversion of a device made from such modules into an integrated PCB, lowering production costs compared to other modular approaches, and, owing to this modularity, automatic generation of a production tester. These properties lower the number of skills required for building an embedded system and quicken the path from an idea to a commercially applicable device. Developers can also build custom modules of their own if they possess the required expertise. The test cases demonstrate the Atomi II Framework techniques in real-world applications and demonstrate the capabilities of Atomi objects. 
According to our test cases and estimations, an Atomi-based device becomes approximately 10% more expensive than a device built from individual components, but saves up to 50% of development time, making the approach feasible for manufacturing quantities of up to 10-50k units.
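The trade-off estimated above (roughly 10% higher unit cost against up to 50% development-time savings) implies a break-even quantity. A sketch with invented cost figures chosen only to reflect those percentages, not taken from the thesis:

```python
def total_cost(nre_eur, unit_cost_eur, quantity):
    """Total cost of producing `quantity` devices: non-recurring
    engineering (development) cost plus per-unit production cost."""
    return nre_eur + unit_cost_eur * quantity

# Illustrative figures: the modular approach halves development cost
# but raises unit cost by ~10%, mirroring the thesis estimates.
nre_discrete, unit_discrete = 100_000.0, 10.0
nre_atomi, unit_atomi = 50_000.0, 11.0

for qty in (1_000, 10_000, 50_000, 100_000):
    atomi = total_cost(nre_atomi, unit_atomi, qty)
    discrete = total_cost(nre_discrete, unit_discrete, qty)
    print(qty, "atomi" if atomi < discrete else "discrete")
```

With these numbers the modular approach wins below 50 000 units and loses above, consistent with the 10-50k feasibility range quoted above.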
616

Characterisation and control of the zinc roasting process

Nyberg, J. (Jens) 07 December 2004 (has links)
Abstract Increasing efficiency is a necessary target for an industrial roaster nowadays. This thesis presents studies on efficiency improvement in the zinc roasting process: process characterisation, control design, implementation and testing. The thesis focuses on the roaster, i.e. on research into the phenomena in the roaster furnace. By learning more about the roasting mechanism, particle size growth and the dynamics of the furnace, new control implementations have been developed. More measurements, analyses and calculated variables have been added to give more information on the state of the furnace. New control variables have been introduced to give the operators more opportunities to set conditions better suited to the actual concentrate feed mixture. Equipment modifications have also been made. In this research, both laboratory and plant experiments have been performed, together with thermodynamic evaluations and calculations. It has been necessary to conduct plant trials in order to obtain information about the impact of different variables on the process; only full-scale experiments give reliable results on the behaviour of an industrial furnace. The experiments with the roaster furnace have emphasised the study of both the metallurgy and the dynamics of the roasting process. The on-line calculated oxygen coefficient and its active control have proved important. The particle size distribution analysis of the furnace calcine has been shown to be a significant source of information for evaluating the state of the roasting furnace. The main target is to improve economic performance. The key is flexibility in using different kinds of raw materials, because the main income is the treatment charge. The trend is that concentrates are becoming finer, which increases the challenges for roaster furnace control. The capability to use low-grade concentrates is also a major challenge and improves the economic result. 
Research and development on the boiler and on mercury removal has also been part of this work, for several reasons: improved boiler performance and mercury removal give more freedom in choosing concentrates and operating the roaster furnace. The approach has been the same as in the roaster furnace research and development work. Control improvements based on existing knowledge, such as fuzzy control systems for controlling the furnace temperature and mercury removal, did stabilize the process, but they did not solve all the problems regarding process stability. The research and development concept of this thesis has provided the extra knowledge needed for further improvement of process control. The results of the process characterisation have led to the implementation of a new and effective control strategy. The research and development carried out has improved performance in a number of ways: increased running time of the furnace and boiler; in-depth knowledge of roasting phenomena, which led to new control methods and instructions for the operators; improved quality of sulphuric acid and a method to control its quality; and measurements and analyses that give valuable information about the state of the process - all of which are now in use. In the future, the emphasis will be placed on the research and development of roaster furnace performance, which will be a great challenge. Control of the roaster furnace is the key to the economic success of the roasting process, and more information about these phenomena is needed for improving and optimising control.
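The on-line oxygen coefficient mentioned above can be illustrated as the ratio of oxygen actually fed to the furnace to the stoichiometric requirement of the concentrate. A simplified sketch assuming a pure-ZnS (sphalerite) feed, ZnS + 3/2 O2 -> ZnO + SO2; the operating figures are invented, and a real feed mixture would include other sulphides:

```python
# Each mole of ZnS consumes 1.5 mol O2 in roasting to ZnO and SO2.
O2_PER_MOL_ZNS = 1.5
M_ZNS = 97.47  # g/mol, molar mass of ZnS

def oxygen_coefficient(o2_feed_mol_h, concentrate_kg_h, zns_fraction):
    """Ratio of O2 supplied to the stoichiometric requirement of the
    ZnS in the concentrate feed (simplified: ZnS is the only sulphide)."""
    zns_mol_h = concentrate_kg_h * 1000 * zns_fraction / M_ZNS
    return o2_feed_mol_h / (zns_mol_h * O2_PER_MOL_ZNS)

# Illustrative operating point
coeff = oxygen_coefficient(o2_feed_mol_h=50_000,
                           concentrate_kg_h=2_000,
                           zns_fraction=0.85)
print(round(coeff, 3))
```

A coefficient above 1 indicates excess oxygen; actively controlling this ratio is one of the levers the thesis reports as important for furnace state control.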
617

Bis (trialkoxysilyl) telechelic polymer materials for adhesive applications / Polymère téléchélique bis (trialcoxysilyle) pour les applications adhésives

Ma, Xiaolu 28 September 2016 (has links)
The work presented focuses on the synthesis of liquid (at room temperature) bis(trialkoxysilyl) telechelic polyolefins for adhesive applications. The first approach relies on the combined ring-opening metathesis polymerization/cross metathesis (ROMP/CM) of a cycloolefin or a mixture of cycloolefins using a trialkoxysilyl mono- or difunctionalized alkene acting as a chain transfer agent (CTA) and a ruthenium-based catalyst. 
The efficiency of the reaction and selectivity of the polymer functionality were found to depend much on the nature of the CTA, the catalyst, the solvent and the use of benzoquinone additive as isomerization inhibitor. A high catalytic productivity with a turnover number (TON) up to 100 000 was obtained under optimized conditions. The viscosity of polymers was controlled by adjusting the nature and the ratio of comonomers. The second approach is dedicated to the depolymerization of liquid high 1,4-cis polybutadiene (PBD) in the presence of a CTA and a ruthenium catalyst. The catalytic productivity and selectivity were optimized by changing the method of purification of the commercial PBD, the nature of catalyst and the reaction protocol. This second approach remains, however, less efficient than the first one.
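The turnover number quoted above is simply moles of substrate converted per mole of catalyst, so TON = 100 000 corresponds to full conversion at a catalyst loading of 10 ppm relative to monomer. A one-line check with illustrative quantities:

```python
def turnover_number(monomer_mol, conversion, catalyst_mol):
    """Catalytic productivity: moles of monomer converted per mole
    of catalyst (TON)."""
    return monomer_mol * conversion / catalyst_mol

# Full conversion of 1 mol monomer with 10 ppm (1e-5 mol) Ru catalyst
print(turnover_number(monomer_mol=1.0, conversion=1.0, catalyst_mol=1e-5))
```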
618

Measurements and applications of radon in South African aquifer and river waters

Abdalla, Siddig Abdalla Talha January 2009 (has links)
Philosophiae Doctor - PhD / In the natural decay series of 238U an inert radioactive gas, 222Rn (radon), is formed in the decay of 226Ra. Because radon is relatively soluble in water, it migrates from the places of its generation in rocks and soils to other places, either through soil air or with underground water. There is therefore growing interest among hydrogeologists in using radon as a natural tracer for investigating and managing fresh water reservoirs. This work is aimed at investigating and developing radon-in-water measuring techniques applicable to aquifers and rivers. A gamma-ray spectrometry method using a hyper-pure germanium (HPGe) detector based at iThemba LABS, Cape Town, and Marinelli beakers has been optimized to measure radon in borehole water via the γ-rays associated with the decay of the radon daughters 214Pb and 214Bi (in secular equilibrium with their parent). An accuracy better than 5% was achieved. Moreover, long-term measurements of radon in water from an iThemba LABS borehole have been carried out to investigate the role of radon in characterizing aquifers. These investigations led to the development of a simplified physical model that reproduces the time evolution of the radon concentration during borehole pumping and may be used to estimate the time needed for representative sampling of the aquifer. A novel method is also proposed in this thesis to measure radon in water in the field after grab sampling, a so-called quasi in-situ method. The quasi in-situ method involves inserting a γ-ray detector into a large-volume container filled with the water of interest. The γ-ray spectra are analyzed using an approach based on energy intervals in the high-energy part of the spectrum (1.3 - 3.0 MeV). Each energy interval corresponds to contributions from one of the major γ-ray sources: 40K, the decay series of 238U and 232Th, and cosmic rays. It is assumed that the U interval is dominated by γ-rays emitted from the radon daughters (214Pb and 214Bi). 
Minor contributions from the other sources to each interval are corrected for using MCNPX-simulated standard spectra. The two methods presented in this thesis make a significant contribution to the measurement and modelling of radon in aquifers and surface waters, and form a basis for further development in an interactive mode with hydrological applications. / South Africa
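The energy-interval analysis described above amounts to solving a small linear system: each window's count rate is a mixture of the K, U-series and Th-series contributions, with the mixing coefficients taken from simulated standard spectra. A sketch with a hypothetical stripping matrix; the coefficients are invented for illustration and are not MCNPX output:

```python
import numpy as np

# Hypothetical stripping matrix: element [i, j] is the count rate in
# energy window i per unit concentration of source j (K, U, Th), as
# would be derived from simulated standard spectra.
STRIP = np.array([
    [1.00, 0.10, 0.15],   # K window
    [0.02, 1.00, 0.20],   # U window (dominated by 214Pb/214Bi)
    [0.01, 0.05, 1.00],   # Th window
])

def unfold(window_rates):
    """Solve for the K, U and Th contributions given net count rates
    in the three windows (cosmic background already subtracted)."""
    return np.linalg.solve(STRIP, np.asarray(window_rates, dtype=float))

# Round-trip check: synthesize window rates from known concentrations
true_conc = np.array([5.0, 2.0, 1.0])
rates = STRIP @ true_conc
print(unfold(rates))
```

In the quasi in-situ method the recovered U-window component is the radon signal of interest, since the U interval is dominated by the radon daughters.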
619

Processing hidden Markov models using recurrent neural networks for biological applications

Rallabandi, Pavan Kumar January 2013 (has links)
Philosophiae Doctor - PhD / In this thesis, we present a novel hybrid architecture combining two of the most popular sequence recognition models, Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). Though sequence recognition problems can potentially be modelled by well-trained HMMs, HMMs alone cannot provide a reasonable solution to the more complicated recognition problems. In contrast, the ability of RNNs to handle complex sequence recognition problems is known to be exceptionally good. It should be noted that methods for embedding HMMs into RNNs have been developed by other researchers in the past; however, to the best of our knowledge, no algorithm for processing HMMs through learning has been given. Taking advantage of the structural similarities between the architectural dynamics of RNNs and HMMs, in this work we analyze the combination of these two systems into a hybrid architecture. The main objective of this study is to improve sequence recognition/classification performance by applying a hybrid neural/symbolic approach. In particular, trained HMMs are used as the initial symbolic domain theory and directly encoded into an appropriate RNN architecture, meaning that the prior knowledge is processed through the training of the RNNs. The proposed algorithm is then implemented on sample test beds and on real-time biological applications.
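The structural similarity the thesis exploits can be seen in the forward algorithm of an HMM, which is itself a recurrent update: a hidden activation vector is propagated through the transition matrix and modulated by the emission probabilities at each step, much like an RNN hidden state. A toy sketch with invented parameters, showing only the shared recurrence, not the thesis's encoding or training procedure:

```python
import numpy as np

def forward(A, B, pi, observations):
    """HMM forward algorithm written as a recurrent state update:
    alpha is propagated through the transition matrix A and reweighted
    by the emission probabilities B at each observation."""
    alpha = pi * B[:, observations[0]]
    for o in observations[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()  # likelihood of the observation sequence

# Toy 2-state HMM with binary emissions (illustrative parameters)
A = np.array([[0.7, 0.3], [0.4, 0.6]])    # transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])    # emission probabilities
pi = np.array([0.5, 0.5])                  # initial distribution
print(forward(A, B, pi, [0, 1, 0]))
```

Encoding a trained HMM into an RNN amounts, roughly, to initializing recurrent weights from A and input weights from B, after which ordinary gradient training refines the prior knowledge.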
620

Discrete pulse transform of images and applications

Fabris-Rotelli, Inger Nicolette 02 May 2013 (has links)
The LULU operators Lₙ and Uₙ operate on neighbourhoods of size n. The Discrete Pulse Transform (DPT) of images is obtained via recursive peeling of so-called local maximum and minimum sets with the LULU operators as n increases from 1 to the maximum number of elements in the array. The DPT provides a new nonlinear decomposition of a multidimensional array. This thesis investigates the theoretical and practical soundness of the decomposition for image analysis. The theoretical justification of the DPT is provided by the consistency of the decomposition (a pseudo-linear property) and by its setting as a nonlinear scale-space, namely the LULU scale-space. A formal axiomatic theory for scale-space operators and scale-spaces is also presented. The practical soundness of the DPT is investigated in image sharpening, best approximation of an image, noise removal in signals and images, feature point detection (with ideas for extending the work to object tracking in videos), and image segmentation. LULU theory on multidimensional arrays and the DPT is now at a point where concrete signal, image and video analysis algorithms can be developed for a wide variety of applications. / Thesis (PhD)--University of Pretoria, 2013. / Mathematics and Applied Mathematics / unrestricted
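The peeling with Lₙ and Uₙ can be illustrated in one dimension: Lₙ removes upward pulses of width at most n, Uₙ removes downward ones. A minimal sketch; edge values are replicated at the boundary, a simplification of the bi-infinite-sequence definition used in the theory:

```python
def L(x, n):
    """Lower LULU operator L_n: max over windows of length n+1
    containing each position of the window minimum. Removes upward
    pulses of width <= n."""
    xp = [x[0]] * n + list(x) + [x[-1]] * n  # replicate edges
    out = []
    for i in range(len(x)):
        j = i + n  # position in the padded sequence
        out.append(max(min(xp[s:s + n + 1]) for s in range(j - n, j + 1)))
    return out

def U(x, n):
    """Upper LULU operator U_n: min over windows of the window maximum.
    Removes downward pulses of width <= n."""
    xp = [x[0]] * n + list(x) + [x[-1]] * n
    out = []
    for i in range(len(x)):
        j = i + n
        out.append(min(max(xp[s:s + n + 1]) for s in range(j - n, j + 1)))
    return out

signal = [0, 0, 5, 0, 0, -3, 0]   # one upward and one downward pulse
print(L(signal, 1))               # upward spike removed
print(U(L(signal, 1), 1))         # downward spike removed too
```

Applying Uₙ∘Lₙ (or Lₙ∘Uₙ) for n = 1, 2, ... and collecting the differences between successive smoothings yields the pulses of the DPT; the two-dimensional case replaces intervals with connected neighbourhoods.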
