211

High Performance by Exploiting Information Locality through Reverse Computing / Hautes Performances en Exploitant la Localité de l'Information via le Calcul Réversible.

Bahi, Mouad 21 December 2011 (has links)
The main resources for computation are time, space, and energy; reducing them is a central challenge in the pursuit of processor performance. In this thesis we are interested in a fourth factor: information. Information has a direct impact on these three resources, and we show how it contributes to performance optimization. Landauer showed that, independently of the hardware on which a computation runs, it is the logical erasure of information that dissipates energy; this is a fundamental result of thermodynamics. Under this hypothesis, only reversible computations, in which no information is ever lost, can be thermodynamically adiabatic and dissipate no power. Reversibility means that the original and intermediate values can be retrieved at any point of the computation. Information may be carried not only by a datum but also by the process and the input data that generate it. When a computation is reversible, information can also be retrieved from other already computed data together with the reverse computation. Hence reversible computing improves information locality. This thesis develops these ideas in two directions. In the first part, we address the spatial cost of making a computation, given as a DAG (directed acyclic graph), reversible. We define the energetic garbage as the additional number of registers needed for the reversible computation with respect to the original computation. We propose a reversible register allocator and show empirically that the garbage size is never more than half the number of nodes of the DAG. In the second part, we apply this approach to the trade-off between recomputing (direct or reverse) and storage in the context of recent vector and parallel coprocessors such as graphics processing units (GPUs) and the IBM Cell processor, where the gap between memory access time and processor cycle time keeps widening. We show that recomputing in general, and reverse computing in particular, helps reduce register requirements and hence memory pressure. This approach of reverse rematerialization also increases instruction-level parallelism (Cell BE) and thread-level parallelism on multicore processors with a shared register file and/or memory (GPU), where the number of registers required per thread limits the number of running threads and strongly affects performance. Reverse rematerialization generates additional instructions, but their cost is largely hidden by the gain in parallelism. Experiments on the highly memory-demanding Lattice QCD simulation code on an Nvidia GPU show a performance gain of up to 11%.
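The core idea of reverse rematerialization described above can be made concrete with a toy example: rather than keeping a value live in a register or spilling it to memory, a later use recomputes it by inverting the operation that consumed it. The sketch below is illustrative only; the toy expression and function names are not taken from the thesis.

```python
# Toy illustration of reverse rematerialization: instead of keeping a value
# live in a register (or spilling it to memory), recover it later by running
# the inverse of the operation that consumed it.

def forward(a, b):
    # c overwrites a's register: after this step only b and c are live.
    c = a + b
    return b, c

def reverse_rematerialize(b, c):
    # Recover a without ever having stored it: a = c - b (inverse of c = a + b).
    return c - b

a, b = 7, 5
b_live, c = forward(a, b)
a_recovered = reverse_rematerialize(b_live, c)
assert a_recovered == a  # the "erased" value is recomputed, not reloaded from memory
```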
212

Work stress, work-home interference, and organisational culture of insurance employees in Zimbabwe

Mudzimu, Peggy Tapiwa Vimbai 08 1900 (has links)
The research revolves around the emergence of globalisation, change, competition, work pressure, and risk, among other factors, which have exposed insurance employees to work stress that can interfere with home activities. The purpose of the research was to determine the relationship between work stress, work-home interference, and organisational culture among insurance employees in the Zimbabwean context. The sample consisted of 240 participants, of whom 190 employees responded to the questionnaires. The questionnaires were analysed in SPSS using internal consistency reliability analysis and inter-correlation analysis; the inferential statistics used were multiple linear regression and one-way ANOVA. Substantial positive and negative correlations were noted among the six sub-scales of the Occupational Roles Questionnaire (ORQ), the negative work-home interference (NWHI) and positive work-home interference (PWHI) scales, and the three sub-scales of the Organisational Culture Index (OCI). The research concluded that different measures should be taken to manage work stressors, depending on the organisational culture and its employees, to prevent spill-over that contributes to negative work-home interference. / Industrial and Organisational Psychology / M. Com. (Industrial and Organisational Psychology)
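For readers unfamiliar with the statistical pipeline mentioned above, the following sketch runs the same kinds of tests (multiple linear regression and one-way ANOVA) on synthetic data; the variable names, group labels, and generated scores are assumptions for illustration, not the study's data.

```python
# Illustrative sketch (synthetic data) of the analysis described above:
# multiple linear regression of negative work-home interference (NWHI) on the
# six ORQ sub-scales, plus a one-way ANOVA across three assumed culture groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 190                                   # respondents who returned questionnaires
orq = rng.normal(size=(n, 6))             # six ORQ sub-scale scores (standardised)
nwhi = orq @ rng.normal(size=6) + rng.normal(scale=0.5, size=n)

# Multiple linear regression via least squares (intercept + 6 predictors).
X = np.column_stack([np.ones(n), orq])
coef, *_ = np.linalg.lstsq(X, nwhi, rcond=None)
print("regression coefficients:", np.round(coef, 2))

# One-way ANOVA: does mean NWHI differ across three culture types?
culture = rng.integers(0, 3, size=n)      # 0/1/2 = hypothetical culture groups
groups = [nwhi[culture == g] for g in range(3)]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```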
213

Stanovení charakteristik spreje pomocí optických měřících metod / Measurement of spray characteristics using optical measurement methods

Ďurdina, Lukáš January 2012 (has links)
This master's thesis deals with measuring the spray characteristics of two pressure-swirl atomizers for the combustion chamber of a small turbine engine, carried out on a cold test rig using Particle Image Velocimetry (PIV) and Phase Doppler Anemometry (PDA). The aim of the measurements was to determine and compare the spray characteristics of both atomizers; the results are intended to clarify the differences in the atomizers' behaviour in operation and the possible impact on the combustion process. The introductory theoretical part covers the basic physical principles of liquid atomization, the design and fields of application of pressure-swirl atomizers, and the principles of the laser diagnostic methods used in the experiments. The following part describes the design and assembly of the test rig and other equipment built for the experimental measurements in this work. The experimental part deals with setting the parameters of the measurement system and with data processing. The results include vector velocity fields, axial velocity profiles, and drop size distributions for various operating conditions of both atomizers.
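As a small illustration of the kind of drop-size statistics such PDA measurements yield, the sketch below computes the Sauter mean diameter (D32 = Σ d³ / Σ d²), a standard spray characteristic, from a synthetic sample of droplet diameters; the data and distribution parameters are assumptions, not measurements from the thesis.

```python
# Illustrative computation of the Sauter mean diameter D32 from per-droplet
# diameters of the kind a PDA system reports. Diameters here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
diameters_um = rng.lognormal(mean=3.2, sigma=0.4, size=10_000)  # synthetic sample, micrometres

d32 = np.sum(diameters_um**3) / np.sum(diameters_um**2)  # Sauter mean diameter
d10 = diameters_um.mean()                                # arithmetic mean diameter
print(f"arithmetic mean D10 = {d10:.1f} um, Sauter mean D32 = {d32:.1f} um")
```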
214

Undocumented oil leakages : A study about stern tube seals and leakages / Odokumenterade oljeläcklage : En studie om propellerhylstätningar och läckage

Lundberg, Johan January 2021 (has links)
The majority of vessels in the commercial fleet use oil lubricated stern tubes. Unfortunately, this brings a risk of oil leaking from the stern tube into the marine environment if the stern tube seal becomes worn or is damaged by foreign material. Previous studies concluded that, on average, 2.6 litres of oil per day leak from a ship's stern tube. This essay investigated the causes that could increase the leakage rate from the stern tube by reviewing the literature, interviewing experts, and sending out surveys on the subject. The answers received painted a clear picture: it is impossible to get a perfect seal on a stern tube. The causes that could influence the leakage rate were design related, such as vibrations, the rotational speed of the propeller shaft, and radial and axial movements of the propeller shaft, as well as external causes such as water quality and foreign materials, for example fishing lines and nets. The essay also compared water lubricated stern tubes with oil lubricated ones, with a focus on lifespan, cost, and maintenance. One conclusion was that the bearing of a water lubricated stern tube does not have as long a lifespan as an oil lubricated bearing.
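For scale, annualising the cited figure shows roughly how much oil a single vessel may release per year; this is simple arithmetic on the 2.6 litres/day estimate, not a result from the essay.

```python
# Annualise the cited average stern tube leakage of 2.6 litres/day per vessel.
daily_leak_litres = 2.6
annual_leak_litres = daily_leak_litres * 365
print(f"~{annual_leak_litres:.0f} litres of stern tube oil per vessel per year")  # ~949 L
```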
215

Modélisation et mesure de l’interaction d’une onde électromagnétique avec une surface océanique. Application à la détection et à la caractérisation radar de films d’hydrocarbures. / Electromagnetic Wave Scattering Modeling and Measurement from Ocean Surfaces. Detection and Characterization of an Oil Film.

Mainvis, Aymeric 05 December 2018 (has links)
Satellite and airborne systems currently used to detect and characterize oil slicks at sea rely on optical or radar sensors. Their performance is degraded by a still-high false-alarm rate or by excessively long data processing times. Methods for detecting, identifying, and quantifying offshore hydrocarbon leaks can therefore be improved by combining robustness with reactivity. Such an improvement requires an in-depth understanding of the oceanographic and electromagnetic phenomena at work in this particular scene. The thesis draws on a dataset of airborne and satellite optical and SAR images together with laboratory measurements, which is used to check the consistency of the modeling results. The objective is to distinguish a polluted sea surface from a clean one using the electromagnetic signature of the whole surface, and then to determine the type and quantity of hydrocarbon present. The work is divided into two domains, namely oceanographic modeling and electromagnetic modeling. The oceanographic part covers the simulation of a rough surface imitating a clean or polluted sea surface; this surface must be generated over a large area, at a resolution fine enough to capture the small waves, and with minimal generation time. The electromagnetic part focuses on asymptotic models of electromagnetic wave scattering by a rough interface; these models suit the context of the thesis, namely the complexity of the scene and the need for fast processing, but rely on several hypotheses for their application.
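One common way to generate a long rough-surface realisation quickly and at fine resolution is spectral synthesis via an inverse FFT; the sketch below shows a 1-D version with a purely illustrative power-law spectrum. This is a standard technique offered for context, not necessarily the method used in the thesis.

```python
# Sketch of spectral (inverse-FFT) synthesis of a 1-D rough surface profile.
# The power-law spectrum is purely illustrative, not an ocean wave spectrum.
import numpy as np

N, L = 4096, 200.0                         # samples and domain length (m)
dx = L / N
k = np.fft.rfftfreq(N, d=dx) * 2 * np.pi   # spatial wavenumbers (rad/m)

S = np.zeros_like(k)
S[1:] = 1e-3 * k[1:]**-3                   # illustrative power-law spectrum

rng = np.random.default_rng(2)
amp = np.sqrt(S * 2 * np.pi / L)           # amplitude per spectral bin
phases = rng.uniform(0, 2 * np.pi, size=k.size)
spectrum = amp * np.exp(1j * phases) * N   # scale for numpy's irfft convention

surface = np.fft.irfft(spectrum, n=N)      # real-valued height profile h(x)
print(f"rms height: {surface.std():.3f} m over {L:.0f} m at dx = {dx:.3f} m")
```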
216

Genre criticism : an application of BP's image restoration campaign to the crisis communication genre

Eastlick, Anne C. 01 January 2011 (has links)
Within two months of its emergence, the BP Gulf oil spill had become the worst environmental disaster in United States history. For those studying public relations, however, the oil spill brought more than ecological disaster: it provided a case study in crisis communication. Although BP issued a number of crisis responses over the course of the oil spill, its primary crisis response was an image restoration campaign that premiered in early June 2010. This campaign, though it exhibits qualities of a standard crisis response, was wildly unpopular with the United States government and citizenry. This rhetorical analysis attempts to uncover the reasons behind the campaign's failure through an application of the genre model of criticism. By defining the crisis communication genre and applying it to the artifact, the study identifies why the campaign failed: BP did not address all necessary exigencies, nor did it consider the influence a rhetor can have on a message. This explanation of the campaign's failure offers numerous implications for the fields of public relations and rhetorical criticism, while opening a discussion that helps define the crisis communication genre.
217

The fate and distribution of subsurface hydrocarbons released during the 2010 MC252 oil spill in deep offshore waters of the Gulf of Mexico

Spier, Chelsea L. 01 January 2012 (has links)
The explosion of the Deepwater Horizon oil platform on April 20, 2010 resulted in the second largest oil spill in history. In this study, the distribution and chemical composition of hydrocarbons within a 45 km radius of the blowout were investigated. A complete set of hydrocarbon data was acquired from the National Oceanic and Atmospheric Administration (NOAA) and from BP, including data from 16 research missions collected over eight weeks. The distribution of hydrocarbons in subsurface waters was found to be more dispersed, over a wider area, than previously predicted or reported. Several hydrocarbon plumes were identified, including a near-surface plume (0.5 to 50 m), two small mid-depth plumes (240 to 290 m and 850 to 880 m), and a large deepwater plume approximately 1,050 to 1,300 m below the surface. Water soluble compounds were preferentially extracted from the rising oil in deepwater and were found at potentially toxic levels both inside and outside the areas previously reported to contain the majority of the hydrocarbons. Data collected from different research missions covered a wide variety of chemical compounds, but not every sample was analyzed for the same compounds. To overcome this variability, a non-parametric method based on the percentage of detectable results was used for all data analysis, in addition to evaluation of total sample concentrations; the two techniques yielded similar results. This approach may be useful in other studies in which samples are measured for varying numbers of compounds and have varying detection limits. The distribution and toxicity of hydrocarbons in sediments between August and October 2010 were also investigated and were found to be fairly localized.
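The non-parametric summary described above can be illustrated with a short sketch: for each compound, count the percentage of analysed samples in which it was detected above its own reporting limit, which naturally tolerates samples analysed for different compound lists and with different detection limits. The sample data and compound names below are synthetic.

```python
# Detection-frequency summary: percentage of analysed samples in which each
# compound exceeds its per-sample detection limit. Data are synthetic.
from collections import defaultdict

# Each sample: {compound: (measured_value, detection_limit)} - compounds not
# analysed in a given sample are simply absent from its dict.
samples = [
    {"benzene": (0.8, 0.1), "toluene": (0.05, 0.1)},
    {"benzene": (0.0, 0.2)},
    {"toluene": (1.4, 0.1), "naphthalene": (0.3, 0.05)},
]

analysed = defaultdict(int)
detected = defaultdict(int)
for sample in samples:
    for compound, (value, limit) in sample.items():
        analysed[compound] += 1
        if value >= limit:
            detected[compound] += 1

for compound in sorted(analysed):
    pct = 100.0 * detected[compound] / analysed[compound]
    print(f"{compound}: detected in {pct:.0f}% of {analysed[compound]} analysed samples")
```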
218

The Spillable Environment: Expanding a Handheld Device's Screen Real Estate and Interactive Capabilities

Clement, Jeffrey S. 07 August 2007 (has links) (PDF)
Handheld devices have a limited amount of screen real estate. If a handheld device could take advantage of larger screens, it would create a more powerful user interface and environment. Moore's law predicts that the computational power of handheld devices will increase dramatically, making interaction with a larger screen increasingly practical. Users can then use their peripheral vision to recognize spatial relationships between objects and solve problems more easily with this integrated system. In the spillable environment, the handheld device uses a DiamondTouch table, a large, touch-sensitive horizontal table, to enhance the viewing environment. When the user moves the handheld device on the DiamondTouch, the orientation of the application changes accordingly; a user can let another person see the application by rotating the handheld device in that person's direction. Such a system is well suited to public settings. In a business meeting, a user can easily show documents and presentations to other users around the DiamondTouch table; in an academic setting, a tutor could easily explain a concept to a student. A user could do all of this while keeping all of his or her information on the handheld device. A wide range of applications could be used in these types of settings.
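The orientation behaviour described above amounts to rotating the projected application about the device's position on the tabletop; the toy geometry below illustrates the idea. It is a hypothetical sketch, not the system's actual DiamondTouch API.

```python
# Toy geometry: rotate the projected application window about the handheld
# device's position so it follows the device's heading on the tabletop.
import math

def rotate_point(x, y, cx, cy, angle_deg):
    """Rotate (x, y) about centre (cx, cy) by angle_deg degrees."""
    a = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(a) - dy * math.sin(a),
            cy + dx * math.sin(a) + dy * math.cos(a))

cx, cy = 0.5, 0.3                      # device position on the table (normalised)
corners = [(0.4, 0.2), (0.6, 0.2), (0.6, 0.4), (0.4, 0.4)]  # app window corners

device_heading_deg = 90.0              # device rotated to face a user on one side
rotated = [rotate_point(x, y, cx, cy, device_heading_deg) for x, y in corners]
print([(round(x, 2), round(y, 2)) for x, y in rotated])
```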
219

Comparing Media Coverage of the Gulf Oil Spill in the US and UK: Implications for Global Crisis Communication

Crytzer, Sarah 01 January 2011 (has links)
The following research is a content analysis of 114 articles written by American and British news media outlets in the first month following the BP Gulf oil spill in April 2010. The goal of the research was to identify any dominant frames evident in the reports and to compare the two countries to see whether the dominant frames differed. Positive, negative, and neutral tones were also evaluated to determine whether there was a difference between the countries. The results show that reports from both countries predominantly used ecology and action frames, while British media outlets also used an economic frame. Both countries' reports carried primarily negative and neutral tones. The implications of these findings for crisis communication managers are discussed.
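The counting step behind such a framing analysis is straightforward; the sketch below tallies dominant frame and tone per article by country. The coded articles are synthetic, and only the category labels mirror those named in the abstract.

```python
# Tally dominant frame and tone per article, grouped by country (synthetic data).
from collections import Counter

articles = [
    {"country": "US", "frame": "ecology", "tone": "negative"},
    {"country": "US", "frame": "action", "tone": "neutral"},
    {"country": "UK", "frame": "economic", "tone": "negative"},
    {"country": "UK", "frame": "ecology", "tone": "neutral"},
]

frames_by_country = Counter((a["country"], a["frame"]) for a in articles)
tones_by_country = Counter((a["country"], a["tone"]) for a in articles)
print(frames_by_country)
print(tones_by_country)
```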
220

Biomineralization of atrazine and analysis of 16S rRNA and catabolic genes of atrazine-degraders in a former pesticide mixing and machinery washing area at a farm site and in a constructed wetland

Douglass, James F. January 2015 (has links)
No description available.
