51

Reducing Energy Consumption through Optimization of the Operating Conditions of the Gas Trunk Pipeline

Albutov, Alexey January 2013 (has links)
Supplying gas to consumers requires a considerable share of energy for upstream, midstream and downstream purposes. Despite substantial investment in the industry, there is still room to improve the efficiency of its energy use, and the largest share of energy consumption falls on the transportation sector. Optimizing the operating conditions of a gas pipeline is one of the cheapest ways to reduce energy consumption: it requires no capital investment and works only with the operating parameters. The adjustable operating parameters of a gas pipeline include the operating pressure, the rotational speed of the compressors, the number of operating units and the gas temperature downstream of a compressor station, among others. Energy consumption depends on the combination of these parameters, which determines the operating mode needed to provide a particular gas flow through the pipeline, the maximum capacity, the minimum energy consumption and so on. From an energy-saving point of view, it is therefore possible to reduce energy demand in the gas industry by optimizing the operating mode. Several approaches to achieving such reductions are investigated in this work and presented in this article, including redistributing the load between compressor stations, varying the depth of gas cooling and changing the loading of gas pumping units. The results of the analysis of the study model show the potential for improving the efficiency of gas trunk pipelines.
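To make the idea of operating-mode optimization concrete, the following Python sketch (not the thesis's model; every function, unit and number in it is a hypothetical placeholder) enumerates combinations of operating units, compressor speed and post-station gas temperature and picks the cheapest combination that still meets a required flow.

```python
# Minimal sketch of operating-mode selection for a pipeline section.
# All models and constants below are made up for illustration only.
from itertools import product

def flow_capacity(n_units, speed_rpm, outlet_temp_c):
    """Hypothetical stand-in for a hydraulic model of the section (MMscm/d)."""
    return n_units * speed_rpm * 0.004 * (1 - 0.002 * outlet_temp_c)

def energy_use(n_units, speed_rpm, outlet_temp_c):
    """Hypothetical stand-in for compressor power plus gas-cooling power (MW)."""
    compression = n_units * (speed_rpm / 1000.0) ** 3 * 1.5   # cubic speed law, made-up constant
    cooling = n_units * max(0.0, 45.0 - outlet_temp_c) * 0.05  # deeper cooling costs more power
    return compression + cooling

required_flow = 60.0  # MMscm/d, hypothetical delivery target

candidates = product(range(2, 6),             # number of operating units
                     range(3500, 5100, 100),  # compressor speed, rpm
                     range(20, 46, 5))        # gas temperature after the station, degC
feasible = [c for c in candidates if flow_capacity(*c) >= required_flow]
best = min(feasible, key=lambda c: energy_use(*c))
print("units, rpm, outlet T:", best, "energy [MW]:", round(energy_use(*best), 2))
```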
52

An Investigation into Succession Planning Initiatives in Government : A Case of the Botswana National Archives and Records Service / Sylvia Siane

Siane, Sylvia January 2013 (has links)
This research is an investigation into succession planning initiatives in government agencies in Botswana, taking the Botswana National Archives and Records Services as a case. The Botswana National Archives and Records Services has been experiencing a significant loss of employees over the years. When people leave the organisation, the vacant positions often take too long to fill, especially those that require technical qualifications and experience in archives and records management. According to the researcher, this state of affairs may be addressed by implementing a succession planning strategy. Succession planning has the potential to uplift government agencies in terms of increased productivity, motivation, efficiency and retention of staff. It can help organisations invest more in developing their staff for future key positions to ensure business continuity, and it is important for organisations to have a strong pipeline of talent so that key positions are easily filled as they become vacant. This study sought to establish whether the Botswana National Archives and Records Services has any succession planning initiatives and how well its employees understand the concept of succession planning. A questionnaire was designed and administered to the staff of the National Archives. The findings reveal that most of the employees do not understand the concept of succession planning and that the organisation has no succession planning initiative. These findings led to the conclusion that the National Archives should include succession planning in its business strategy and educate its employees about it to ensure business continuity. / Thesis (MBA) North-West University, Mafikeng Campus, 2013
53

Profile-driven parallelisation of sequential programs

Tournavitis, Georgios January 2011 (has links)
Traditional parallelism detection in compilers is performed by means of static analysis, and more specifically data and control dependence analysis. The information available at compile time, however, is inherently limited and therefore restricts the parallelisation opportunities. Furthermore, applications written in C – which represent the majority of today’s scientific, embedded and system software – utilise many low-level features and an intricate programming style that forces the compiler to make even more conservative assumptions. Despite numerous proposals to handle this uncertainty at compile time using speculative optimisation and parallelisation, the software industry still lacks pragmatic approaches that extract coarse-grain parallelism to exploit the multiple processing units of modern commodity hardware. This thesis introduces a novel approach for extracting and exploiting multiple forms of coarse-grain parallelism from sequential applications written in C. We utilise profiling information to overcome the limitations of static data and control-flow analysis, enabling more aggressive parallelisation. Profiling is performed using an instrumentation scheme operating at the Intermediate Representation (IR) level of the compiler. In contrast to existing approaches that depend on low-level binary tools and debugging information, IR profiling provides precise and direct correlation of profiling information back to the IR structures of the compiler. Additionally, our approach is orthogonal to existing automatic parallelisation approaches, so additional fine-grain parallelism may be exploited. We demonstrate the applicability and versatility of the proposed methodology using two studies that target different forms of parallelism. First, we focus on the exploitation of loop-level parallelism that is abundant in many scientific and embedded applications. We evaluate our parallelisation strategy against the NAS and SPEC FP benchmarks on two different multi-core platforms (a shared-memory Intel Xeon SMP and a heterogeneous distributed-memory IBM Cell blade). Empirical evaluation shows that our approach not only yields significant improvements when compared with state-of-the-art parallelising compilers, but comes close to and sometimes exceeds the performance of manually parallelised codes. On average, our methodology achieves 96% of the performance of the hand-tuned parallel benchmarks on the Intel Xeon platform, and a significant speedup for the Cell platform. The second study addresses the problem of partially sequential loops, typically found in implementations of multimedia codecs. We develop a more powerful whole-program representation based on the Program Dependence Graph (PDG) that supports profiling, partitioning and code generation for pipeline parallelism. In addition, we demonstrate how this enhances conventional pipeline parallelisation by incorporating support for multi-level loops and pipeline stage replication in a uniform and automatic way. Experimental results using a set of complex multimedia and stream processing benchmarks confirm the effectiveness of the proposed methodology, which yields speedups of up to 4.7 on an eight-core Intel Xeon machine.
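As a toy illustration of the profile-driven idea — a drastic simplification of the thesis's approach, with a made-up trace format — the Python sketch below marks loops whose profiled iterations never touched a common memory address as candidates for coarse-grain (DOALL-style) parallelisation. Being profile-based, such a decision is only a hint, not a proof of independence.

```python
# Minimal sketch: flag loops with no cross-iteration memory conflicts in a profile.
# Trace format and addresses are hypothetical.
from collections import defaultdict

def candidate_doall_loops(trace):
    """trace: iterable of (loop_id, iteration, address, kind) with kind in {'R', 'W'}."""
    readers = defaultdict(set)   # (loop_id, address) -> iterations that read it
    writers = defaultdict(set)   # (loop_id, address) -> iterations that wrote it
    loops = set()
    for loop_id, it, addr, kind in trace:
        loops.add(loop_id)
        (writers if kind == 'W' else readers)[(loop_id, addr)].add(it)

    def conflicts(loop_id):
        # A written address touched in more than one iteration is a cross-iteration dependence.
        for (lid, addr), w_iters in writers.items():
            if lid != loop_id:
                continue
            touched = w_iters | readers.get((lid, addr), set())
            if len(touched) > 1:
                return True
        return False

    return sorted(l for l in loops if not conflicts(l))

# Hypothetical profile: loop 1 is independent across iterations, loop 2 is not.
trace = [(1, 0, 0x10, 'W'), (1, 1, 0x14, 'W'),
         (2, 0, 0x20, 'W'), (2, 1, 0x20, 'R')]
print(candidate_doall_loops(trace))   # -> [1]
```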
54

Risk-based decision-making for the management of structural assets

Roberts, Caroline January 1999 (has links)
This thesis investigates the benefit of risk-based decision methods in engineering decisions. A thorough literature review identified the major issues and limitations of current methods. Consequently, a more comprehensive model was developed to account for the complexities of real-life decision-making. The enhancements introduced to the model include identifying and evaluating stakeholder influences, decision objectives, criteria, preferences between criteria and decision outcomes. Monitoring and controlling important parameters during implementation is also included to ensure objectives are met and risks controlled. Tools and techniques were identified to support decision-making within the new model. The research focuses on how available techniques can improve engineering decision-making. The model was applied to four case studies analysing real-life, 'live' decision problems in bridge management and pipeline management. These confirmed the relevance and importance of the model enhancements. The practicality of the methods, their benefits and their limitations were evaluated, and the proposed model was refined further. The refined model was shown to bring greater understanding to all four case studies and made the decisions more rational, thorough and auditable. The fifth case study reviewed how unsupported decisions are currently made within the sponsoring company; it involved a detailed desktop analysis of past projects and interviews with senior engineers, and provided further evidence emphasising the value of using the decision model. General guidelines, defined as applicability matrices, were developed based on the case study experiences to help the decision-maker identify the level of analysis required for different types of decision problems. The benefit of using a third-party facilitator in each of the case studies was identified in terms of the roles of leader, liaison, disseminator, spokesman and disturbance handler. The balance between these five roles through the stages of the decision process was found to be important to ensure the facilitator does not dominate the decision.
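As a very small illustration of weighting preferences between decision criteria — not the thesis's model; the criteria, weights and options below are hypothetical — the Python sketch ranks repair options for a structural asset by a weighted sum of criterion scores.

```python
# Minimal weighted-criteria scoring sketch; all values are illustrative.
criteria_weights = {"safety_risk_reduction": 0.4, "whole_life_cost": 0.3,
                    "disruption": 0.2, "environmental_impact": 0.1}

# Scores on a common 0-10 scale, higher meaning better for the decision-maker.
options = {
    "do_nothing":       {"safety_risk_reduction": 1, "whole_life_cost": 8, "disruption": 10, "environmental_impact": 9},
    "patch_repair":     {"safety_risk_reduction": 5, "whole_life_cost": 6, "disruption": 6,  "environmental_impact": 7},
    "full_replacement": {"safety_risk_reduction": 9, "whole_life_cost": 2, "disruption": 2,  "environmental_impact": 4},
}

def weighted_score(scores):
    # Sum of criterion score times the stakeholder-agreed weight for that criterion.
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.1f}")
```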
55

Formation and Representation: Critical Analyses of Identity, Supply, and Demand in Science, Technology, Engineering, and Mathematics

Metcalf, Heather January 2011 (has links)
Considerable research, policy, and programmatic efforts have been dedicated to addressing the participation of particular populations in STEM for decades. Each of these efforts claims equity-related goals; yet, they heavily frame the problem, through pervasive STEM pipeline model discourse, in terms of national needs, workforce supply, and competitiveness. This particular framing of the problem may, indeed, run counter to equity goals, especially when paired with policy that relies largely on statistical significance and broad aggregation of data rather than exploring the identities and experiences of the populations targeted for equitable outcomes in that policy. In this study, I used the mixed-methods approach of critical discourse and critical quantitative analyses to understand how the pipeline model ideology has become embedded within academic discourse, research, and data surrounding STEM education and work, and to provide alternatives for quantitative analysis. Using critical theory as a lens, I first conducted a critical discourse analysis of contemporary STEM workforce studies with a particular eye to pipeline ideology. Next, I used that analysis to inform logistic regression analyses of the 2006 SESTAT data. This quantitative analysis compared and contrasted different ways of thinking about identity and retention. Overall, the findings of this study show that many subjective choices are made in the construction of the large-scale datasets used to inform much national science and engineering policy, and that these choices greatly influence the likelihood-of-retention outcomes.
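For readers unfamiliar with the quantitative side of the study, the following Python sketch shows the general shape of a logistic regression of STEM retention on a few covariates. The data are synthetic and the variable coding is illustrative, not the study's actual SESTAT analysis.

```python
# Synthetic-data sketch of a retention logistic regression; coefficients are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
female = rng.integers(0, 2, n)
underrep_minority = rng.integers(0, 2, n)
years_since_degree = rng.uniform(0, 30, n)

# Synthetic "retained in a STEM occupation" outcome generated from an assumed model.
logit = 0.8 - 0.3 * female - 0.2 * underrep_minority - 0.02 * years_since_degree
retained = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([female, underrep_minority, years_since_degree])
model = LogisticRegression().fit(X, retained)
print("coefficients:", dict(zip(["female", "urm", "years"], model.coef_[0].round(3))))
```

How "retention" is defined (by degree field, occupation code, or self-identification) changes which rows count as retained, which is exactly the kind of subjective construction choice the study examines.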
56

The self-burial of seabed pipelines

Paskin, Sandra January 1993 (has links)
No description available.
57

A study of the molecular organisation in structural PVDF

Glennon, Dermot January 1997 (has links)
No description available.
58

• Plan de negocios para desarrollo de Spin-Off en servicios de integridad de tuberías / Business plan for the development of a spin-off in pipeline integrity services

Escárate Gómez, José Carlos January 2015 (has links)
The author does not authorize access to the full text of this document until 17/8/2020. / Magíster en Gestión y Dirección de Empresas / This work develops a business plan for Brass Chile S.A. to create a spin-off dedicated to pipeline integrity services. Pipeline transport is critical because it has high utilization rates, above 98% of the year, and in many cases practically all of a company's production is moved through this system. Although pipelines have a strong safety record because they are designed and built according to international codes and good engineering practice, they can nevertheless fail. At the same time, significant reductions in the production of some mining companies have changed management priorities, which are now oriented towards improving safety and productivity. By adapting methodologies already established in the oil and gas industry, the integrity services presented in this business plan reduce the probability of failure in pumping systems. The value delivered to the client comes from greater operational reliability, greater efficiency and effectiveness of operations, and postponement of pipeline replacement. The potential market comprises every company that transports material through a concentrate, tailings, water, gas or oil pipeline; the target, however, is the large mining companies with which BRASS maintains a solid network of contacts. The complete integrity service is divided into three sequential stages: audits, including data collection and diagnosis; integrity measurement and assessment; and support for pipeline repair or rehabilitation. Complementary maintenance and operator-training services are also considered. The services involve studies and software-based evaluation carried out at the BRASS offices, together with the subcontracting of companies specializing in thickness measurement and pipeline construction. Based on the sales forecast and a discount rate of 20% per year, the after-tax NPV for the first five years is 366 UF and the resulting IRR is 20.8%. An initial investment of 9,059 UF is required for software development and promotion. Payback occurs at the end of the fifth year. The break-even point is reached at approximately 9,550 UF of net sales revenue, which would be achieved from the seventh quarter onward, before completing two years of operation; this sales volume is equivalent to more than 5,000 engineering man-hours. A Monte Carlo sensitivity analysis considering variations in revenue, sales price, engineering cost and short-term interest rate indicates a 74% probability that the after-tax NPV is positive. Considering the value proposition, the positive financial result at the end of the fifth year and the significant diversification of BRASS's services, implementing the pipeline-integrity spin-off is recommended.
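As a rough illustration of the discounted-cash-flow arithmetic behind figures like those above — the initial investment and discount rate are taken from the abstract, but the yearly cash flows are hypothetical, so the output will not reproduce the plan's reported 366 UF NPV or fifth-year payback — a minimal Python sketch:

```python
# Minimal NPV and payback sketch with made-up yearly cash flows.
initial_investment = 9059.0                               # UF (from the abstract)
discount_rate = 0.20                                      # 20% per year (from the abstract)
cash_flows = [1200.0, 2300.0, 3100.0, 3600.0, 4100.0]     # UF per year, hypothetical

# After-tax NPV: discount each year's flow back to today and subtract the investment.
npv = -initial_investment + sum(cf / (1 + discount_rate) ** t
                                for t, cf in enumerate(cash_flows, start=1))

# Payback period: first year in which cumulative undiscounted flows recover the investment.
cumulative, payback_year = -initial_investment, None
for year, cf in enumerate(cash_flows, start=1):
    cumulative += cf
    if payback_year is None and cumulative >= 0:
        payback_year = year

print(f"NPV at 20%: {npv:.0f} UF, payback in year {payback_year}")
```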
59

BacIL - En Bioinformatisk Pipeline för Analys av Bakterieisolat / BacIL - A Bioinformatic Pipeline for Analysis of Bacterial Isolates

Östlund, Emma January 2019 (has links)
Listeria monocytogenes and Campylobacter spp. are bacteria that can sometimes cause severe illness in humans. Both can be found as contaminants in food that has been produced, stored or prepared improperly, which is why it is important to ensure that food is handled correctly. The National Food Agency (Livsmedelsverket) is the Swedish authority responsible for food safety. One important task is, in collaboration with other authorities, to track and prevent food-related disease outbreaks. For this purpose, bacterial samples are regularly collected at border control, food production facilities and retail, as well as from suspected food items and drinking water during outbreaks, and epidemiological analyses are employed to determine the type of bacteria present and whether they can be linked to a common source. One part of these epidemiological analyses involves bioinformatic analysis of the bacterial DNA, including determination of sequence type and serotype as well as calculation of similarities between samples. Such analyses require data processing in several different steps, which are usually performed by a bioinformatician using different computer programs. Currently the National Food Agency outsources most of these analyses to other authorities and companies, and the purpose of this project was to develop a pipeline that would allow these analyses to be performed in-house. The result was a pipeline named BacIL - Bacterial Identification and Linkage, which automatically performs sequence typing, serotyping and SNP analysis of Listeria monocytogenes, as well as sequence typing and SNP analysis of Campylobacter jejuni, C. coli and C. lari. The result of the SNP analysis is used to create clusters which can be used to identify related samples. The pipeline decreases the number of programs that have to be started manually from more than ten to two.
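As a minimal illustration of the clustering step that follows SNP analysis — not BacIL's actual code; the isolate names, distances and 10-SNP threshold below are hypothetical — the Python sketch links isolates whose pairwise SNP distance falls at or below a threshold and reports the resulting groups of related samples.

```python
# Single-linkage grouping of isolates from pairwise SNP distances (illustrative only).
THRESHOLD = 10  # isolates within this many SNPs are considered linked

pairwise_snps = {
    ("L1", "L2"): 3,  ("L1", "L3"): 250, ("L2", "L3"): 248,
    ("L1", "L4"): 7,  ("L2", "L4"): 6,   ("L3", "L4"): 251,
}

def snp_clusters(distances, threshold):
    # Union-find over isolates; merge every pair at or below the threshold.
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for (a, b), d in distances.items():
        if d <= threshold:
            parent[find(a)] = find(b)
    clusters = {}
    for isolate in {i for pair in distances for i in pair}:
        clusters.setdefault(find(isolate), set()).add(isolate)
    return list(clusters.values())

print(snp_clusters(pairwise_snps, THRESHOLD))   # e.g. [{'L1', 'L2', 'L4'}, {'L3'}]
```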
60

Sistema de gerenciamento e análise de dados por bioinformática / Not available

Pablo Rodrigo Sanches 10 October 2006 (has links)
Os projetos para estudo de genomas ou genes expressos partem de uma etapa de seqüenciamento no qual são gerados em laboratório dados brutos, ou seja, seqüências de DNA sem significado biológico. Estas seqüências de DNA possuem códigos responsáveis pela produção de RNAs e proteínas. O grande desafio dos pesquisadores consiste em analisar essas seqüências e obter informações biologicamente relevantes. Durante esta análise diversos programas de computador, além de um grande volume de dados armazenados em fontes de dados biológicas, são utilizados. Assim sendo, o presente trabalho propôs a elaboração de um sistema computacional que permite a análise de dados sobre biologia molecular e facilite a instanciação do software dependendo do ambiente de trabalho e tipo de projeto de análise. Para este sistema foi dado o nome de Sistema de Gerenciamento de Análise de Dados por Bioinformática - SGADBio. O trabalho apresenta o desenvolvimento do sistema baseado em metodologias de Engenharia de Software, além dos módulos e funções disponíveis. Seqüências oriundas de um projeto de ESTs do fungo dermatófito Trichophyton rubrum, geradas em um laboratório de biologia molecular, foram submetidas ao sistema para análise. Os resultados são expressivos, demonstrando que o sistema é adequado e capaz de adaptar-se a projetos envolvendo seqüenciamento / Projects involving the study of genomes and expressed genes typically start from the raw data generated by laboratory sequencing of DNA, devoid of any biological meaning. Such sequences, however, contain the codes for the production of RNAs and proteins. One of the great challenges faced by researchers is the analysis of these sequences in order to obtain biologically meaningful information. Several computer programs and auxiliary databases are used for that purpose. The present work reports on the development of a computational system capable of supporting molecular biology data analysis; it can be instantiated to suit specific working environments and analysis projects. The system has been called Management and Data Analysis System for Applications in Bioinformatics - SGADBio. This work presents the development of the system based on Software Engineering methodologies, as well as the modules and functionalities involved. Sequences from an EST project involving the dermatophyte fungus Trichophyton rubrum, generated in a molecular biology laboratory, were submitted to the system for analysis. The results are significant, corroborating the versatility of the system and its adaptability to sequencing projects.
