  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
441

Computational Analysis of Viruses in Metagenomic Data

Tithi, Saima Sultana 24 October 2019 (has links)
Viruses have a huge impact on disease control and on regulating many key ecosystem processes. Because metagenomic data can contain entire microbiomes, including many viruses, analyzing such data allows us to study many viruses at the same time. The first step in analyzing metagenomic data is to identify and quantify the viruses present in the data. To address this problem, we developed a computational pipeline, FastViromeExplorer. FastViromeExplorer leverages a pseudoalignment-based approach, which is faster than the traditional alignment-based approach, to quickly align millions or billions of reads. Application of FastViromeExplorer to both human gut samples and environmental samples shows that our tool can identify viruses and quantify their abundances quickly and accurately, even for large data sets. Although viruses are receiving increased attention, most remain unknown or uncategorized. To discover novel viruses from metagenomic data, we developed a computational pipeline named FVE-novel. FVE-novel leverages a hybrid of reference-based and de novo assembly approaches to recover novel viruses from metagenomic data. By applying FVE-novel to an ocean metagenome sample, we successfully recovered two novel viruses and two different strains of known phages. Analysis of viral assemblies from metagenomic data reveals that they often contain assembly errors such as chimeric sequences, in which more than one viral genome is incorrectly assembled into a single sequence. To identify and fix these assembly errors, we developed a computational tool called VirChecker. Our tool can identify and fix assembly errors due to chimeric assembly. VirChecker also extends the assembly as far as possible toward completion and then annotates the extended and improved assembly.
Application of VirChecker to viral scaffolds collected from an ocean metagenome sample shows that our tool successfully fixes assembly errors and extends two novel virus genomes and two strains of known phage genomes. / Doctor of Philosophy / Viruses, the most abundant microorganisms on Earth, have a profound impact on human health and the environment. Analyzing metagenomic data for viruses has the benefit of studying many viruses at a time without the need to cultivate them in a lab environment. In this dissertation, we addressed three research problems in analyzing viruses from metagenomic data. To analyze viruses in metagenomic data, the first question to answer is which viruses are present and in what quantity. To answer this question, we developed a computational pipeline, FastViromeExplorer. Our tool can identify viruses from metagenomic data and quantify their abundances quickly and accurately, even for large data sets. To recover novel virus genomes from metagenomic data, we developed a computational pipeline named FVE-novel. By applying FVE-novel to an ocean metagenome sample, we successfully recovered two novel viruses and two strains of known phages. Examination of viral assemblies from metagenomic data reveals that, due to the complex nature of metagenomic data, viral assemblies often contain assembly errors and are incomplete. To solve this problem, we developed a computational pipeline, named VirChecker, to polish, extend, and annotate viral assemblies. Application of VirChecker to virus genomes recovered from an ocean metagenome sample shows that our tool successfully extended and completed those virus genomes.
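The pseudoalignment idea in the abstract above can be illustrated with a toy sketch: rather than aligning base by base, each read is assigned to the set of reference genomes compatible with all of its k-mers, and abundances are length-normalized fractional counts. This is a simplified illustration with a tiny k-mer size, not FastViromeExplorer's actual implementation.

```python
from collections import defaultdict

K = 5  # toy k-mer size; real pseudoaligners use k around 31

def kmers(seq, k=K):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(refs):
    # map each k-mer to the set of reference genomes containing it
    index = defaultdict(set)
    for name, seq in refs.items():
        for km in kmers(seq):
            index[km].add(name)
    return index

def pseudoalign(read, index):
    # a read is compatible with every genome shared by all of its k-mers
    hits = None
    for km in kmers(read):
        matches = index.get(km, set())
        hits = matches if hits is None else hits & matches
        if not hits:
            return set()
    return hits or set()

def quantify(reads, refs):
    # split multi-mapped reads evenly, then normalize by genome length
    index = build_index(refs)
    counts = defaultdict(float)
    for read in reads:
        hits = pseudoalign(read, index)
        for h in hits:
            counts[h] += 1.0 / len(hits)
    return {name: counts[name] / len(seq) for name, seq in refs.items()}
```

A single read drawn from one of two references then yields a nonzero length-normalized abundance only for that reference.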
442

Methodology to Enhance the Reliability of Drinking Water Pipeline Performance Analysis

Patel, Pruthvi Shaileshkumar 25 July 2018 (has links)
Currently, water utilities face financial constraints in maintaining and expanding services to meet current as well as future demands. Standard practice in pipeline infrastructure asset management is to collect data and predict the condition of pipelines using models and tools. Water utilities want to be proactive in fixing or replacing pipes, as a fix-when-it-fails approach leads to increased cost and can affect environmental quality and societal health. A number of modeling techniques are available for assessing the condition of pipelines, but there is a shortage of methods for checking the reliability of the results obtained using different modeling techniques. This is mainly because of the limited data any one utility collects and the absence of piloting of these models at multiple water utilities. In general, water utilities feel confident about their in-house condition prediction and failure models but are willing to adopt a reliable methodology that can overcome the issues related to validating the results. This paper presents a methodology that can enhance the reliability of model results for water pipeline performance analysis, so that model output can be compared with the behavior of the real system with confidence. The proposed methodology was tested using datasets from two large water utilities and was found to potentially help water utilities gain confidence in their analysis results by establishing the statistical significance of those results. / Master of Science
443

Bioflow: A web based workflow management system for design and execution of genomics pipelines

Puthige, Ashwin Acharya 11 January 2014 (has links)
The cost of sequencing genomes has decreased drastically in the last few years. Knowledge of full genomes has increased the pace of advancement in functional genomics. Computational genomics, which analyzes these sequences, has seen similar growth. The multitude of sequencing technologies has resulted in various formats for storing sequences, and this in turn has led to the creation of many tools for DNA analysis. There are tools for sorting, indexing, analyzing read groups, and other tasks. Genomic analysis often requires the creation of pipelines that process DNA sequences by chaining together many tools. This results in complex scripts that glue these tools together and pass the output from one stage to the next. There are also tools that allow creation of these pipelines through a graphical user interface, but they are complex to use, and it is difficult to quickly add newly developed tools to existing workflows. To solve these issues, we developed BioFlow, a web-based genomic workflow management system. Using BioFlow does not require any programming skills. The integrated workflow designer allows users to create and save workflows; a pipeline is created by connecting tools with a visual connector. BioFlow provides a simple interface that allows users to quickly add tools for use in any workflow. Audit logs are maintained at each stage, which helps users easily identify errors and fix them. / Master of Science
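The tool-chaining described above can be sketched as a minimal linear workflow object: stages are added in order, each stage's output feeds the next, and an audit log records each completed step. The stage functions and names here are hypothetical stand-ins for real genomics tools, not BioFlow's actual API.

```python
# hypothetical stage functions standing in for real tools (a sorter, an indexer)
def sort_reads(data):
    return sorted(data)

def index_reads(data):
    return {value: position for position, value in enumerate(data)}

class Workflow:
    """A minimal linear pipeline: each stage's output feeds the next stage."""

    def __init__(self):
        self.steps = []  # list of (name, callable), executed in insertion order

    def add(self, name, func):
        self.steps.append((name, func))
        return self  # chaining mirrors connecting tools in a visual designer

    def run(self, data, audit_log=None):
        for name, func in self.steps:
            data = func(data)
            if audit_log is not None:
                audit_log.append(name)  # per-stage audit trail, as in BioFlow
        return data
```

Running `Workflow().add("sort", sort_reads).add("index", index_reads)` on unsorted input produces the indexed result and a two-entry audit log.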
444

Homegrown Teacher Project: Developing an Early Intervention Pipeline for Teachers of Color

Moreno, Yadira 01 January 2018 (has links) (PDF)
The dissertation aims to explore a solution to address the cultural and racial gap between the teaching force and the student population in California. Homegrown teachers are teachers who return to the community where they were born and educated. Addressing the equity issues faced in public schools begins with exploring the benefits of teachers of color in the classroom. This action research study followed five homegrown first-generation Latina teachers through a 3-month process of mentoring first-generation Latina sixth-graders who hope to enter the teaching profession in the future. The study was guided by critical pedagogy, a mentoring framework, the critical mentoring strategy, and social capital theory. This dissertation documented the voices of the participants as they developed their mentoring relationships in the early intervention teacher pipeline. Their challenges and experiences were documented through observations, the researcher’s reflections, semistructured interviews, and a focus group. The study revealed that, with appropriate preparation, students of color are more likely to choose a teaching career and return to their community to become homegrown teachers. The emerging themes of the study were that (a) culture and language shaped the mentoring relationship, (b) homegrown teachers were essential to mentoring students of color, (c) for Latinos, education was a family journey, (d) mentoring socialized students of color into career aspirations, (e) acculturation into the teaching profession meant learning to become a teacher, and (f) time and gender were the major constraints, redefining future mentoring relationships. This action research revealed the many benefits for teachers and students of developing critical mentoring relationships.
445

No School Left Behind: Oakland Unified School District Discipline Reform and Policy Implementation Case Study

Segura Betancourt, Maria Alejandra 22 June 2023 (has links)
This paper critically evaluates school discipline reform policy and its implementation in California's Oakland Unified School District after the U.S. Department of Education, Office for Civil Rights investigation. It demonstrates that policy implementation at the school level is as important as policy building and reform at the state and district levels. The Oakland Unified School District was subject to many reforms at the district level through changes in statewide legislation and school board reform after the investigation concluded with several recommendations for the district. This provides a unique opportunity to study policy implementation at the school level to understand how school environment and discretion may affect reform implementation. As research on the effects of punitive school discipline continues to support alternative discipline practices, many states and school districts have begun to implement their own reforms. However, school discretion in how these policies are implemented calls for researchers to focus on the school level of policy implementation. This thesis seeks to understand how policy implemented at the state and district levels differs across schools in the same district, focusing on how school environment can influence implementation. / Master of Arts / This paper evaluates policy implementation in a California school district at the school level. In 2012, the Department of Education Office for Civil Rights conducted an investigation in California's Oakland Unified School District into reports that the district subjected minority students to harsher punishment than their white peers. The Office for Civil Rights found evidence to support this claim and recommended many discipline policy and practice reforms to the district, which the district began to implement throughout its schools.
This paper focuses on reviewing statewide and district-wide discipline reform by comparing two high schools that experienced different suspension outcomes after the reform was implemented. I offer insight into policy implementation by focusing on school environment through mission and vision statements. I perform my analysis through a comparative case study of the two schools as well as content analysis of the state and district policies and practices on school discipline. This paper emphasizes that school policy reform at the state and district levels is important; however, policy implementation at the school level ultimately creates change and is affected by school environment.
446

Domain Specific Language (DSL) visualisation for Big Data Pipelines

Mitrovic, Vlado January 2024 (has links)
With the growth of big data technologies, it has become challenging to design and manage complex data workflows, especially for non-technical people. To understand and process these data effectively, however, we need to rely on domain experts who are often not familiar with the tools available on the market. This thesis identifies the needs and describes the implementation of an easy-to-use tool for defining and visualising data processing workflows. The research methodology includes the definition of customer requirements, architecture design, prototype development, and user testing. The iterative approach used in this project ensures continuous improvement based on user feedback. The final solution is then assessed using KPIs such as usability, integration, performance, and support. / Med den växande big data-tekniken har det blivit en utmaning att utforma och hantera komplexa dataarbetsflöden, särskilt för icke-tekniska personer. För att förstå och bearbeta dessa data på bästa sätt måste vi dock förlita oss på domänexperter som ofta inte är bekanta med de verktyg som finns tillgängliga på marknaden. Denna avhandling identifierar behoven och beskriver implementeringen av ett lättanvänt verktyg för att definiera och visualisera arbetsflödet för databehandling. Detta genom att abstrahera de tekniska krav som krävs av andra lösningar. Forskningsmetoden omfattar definition av kundkrav, arkitekturdesign, prototyputveckling och användartestning. Det iterativa tillvägagångssätt som används i detta projekt säkerställer kontinuerlig förbättring baserat på användarnas feedback. Den slutliga lösningen utvärderas sedan med hjälp av nyckeltal som användbarhet, integration, prestanda och support.
447

DevOpsSec i praktiken : En studie av acceptans till implementering av säkerhetsåtgärder mot Poisoned Pipeline Execution

Jönsson, Adam, Meyer, Jesper January 2024 (has links)
Denna studie undersöker hur säkerhetsåtgärder designade för att förhindra Poisoned Pipeline Execution (PPE) attacker påverkar utvecklare som arbetar inom DevOps-team. Studien använder Technology Acceptance Model (TAM) för att analysera hur dessa åtgärder påverkar utvecklares uppfattning om användbarhet, användarvänlighet och deras intentioner att implementera och använda dem. Studiens syfte är att förstå de utmaningar och möjligheter som är förknippade med att integrera säkerhetsåtgärder i DevOps-arbetsflöden, för att säkerställa att de stärker säkerheten utan att hindra produktivitet eller samarbete. Genom en kvalitativ ansats genomfördes semistrukturerade intervjuer med tre erfarna utvecklare från olika organisationer. Tematisk analys visade att bristande medvetenhet om DevOpsSec och dess risker utgör en betydande utmaning. Medan utvecklare erkänner vikten av säkerhet, finns det oro för hur nya åtgärder kan påverka effektivitet och flexibilitet. Resultaten belyser behovet av att organisationer integrerar säkerhet i sin DevOps-kultur, främjar delat ansvar och förståelse. Studien betonar också vikten av att automatisera säkerhetsåtgärder, då detta kan öka deras upplevda användbarhet och användarvänlighet. Genom att analysera utvecklares perspektiv ger studien insikter i komplexiteten av att implementera säkerhetsåtgärder inom DevOps och belyser de nödvändiga stegen för att stärka säkerheten utan att hindra effektiviteten. / This study investigates the impact of security measures designed to prevent Poisoned Pipeline Execution (PPE) attacks on developers working within DevOps teams. It leverages the Technology Acceptance Model (TAM) to analyze how these measures influence developers' perceived usefulness, ease of use, and subsequent behavioral intentions regarding implementation and utilization. 
The research aims to understand the challenges and opportunities associated with integrating security measures into DevOps workflows, ensuring they enhance security without hindering productivity or collaboration. Employing a qualitative approach, the study conducted semi-structured interviews with three experienced developers from different organizations. Thematic analysis revealed that a lack of awareness about DevOpsSec and its associated risks poses a significant challenge. While developers acknowledge the importance of security, concerns about the impact of new measures on efficiency and flexibility arise. The findings highlight the need for organizations to integrate security into their DevOps culture, fostering a shared responsibility and understanding. Additionally, the study emphasizes the importance of automating security measures, as this can increase their perceived usefulness and ease of use. By analyzing developers' perspectives, the study offers insights into the complexities of implementing security measures within DevOps and sheds light on the necessary steps to enhance security without hindering efficiency.
448

NEUTRON SCATTERING STUDIES OF CRUDE OIL VISCOSITY REDUCTION WITH ELECTRIC FIELD

Du, Enpeng January 2015 (has links)
Small-angle neutron scattering (SANS) is a powerful laboratory technique for investigating the microstructure of various materials, similar in purpose to small-angle X-ray scattering (SAXS) and light scattering. In the SANS technique, neutrons are elastically scattered by changes of refractive index on a nanometer scale inside the sample, through interaction with the nuclei of the atoms present. Because the nuclei of all atoms are compact and of comparable size, neutrons are capable of interacting strongly with all atoms. This is in contrast to X-ray techniques, where the X-rays interact weakly with hydrogen, the most abundant element in most samples. The SANS refractive index is directly related to the scattering length density and is a measure of the strength of the interaction of a neutron wave with a given nucleus. SANS can probe inhomogeneities on length scales from 1 nm to 1000 nm. Since this is a very useful range, the technique provides valuable information across a wide variety of scientific and technological applications, including chemical aggregation, defects in materials, surfactants, colloids, ferromagnetic correlations in magnetism, alloy segregation, polymers, proteins, biological membranes, viruses, ribosomes, and macromolecules. Quoting the Nobel committee, when awarding the prize to C. Shull and B. Brockhouse in 1994: “Neutrons tell you where the atoms are and what the atoms do”. At NIST, a single beam of neutrons is generated from either a reactor or a pulsed neutron source and selected by a velocity selector. The beam passes through a neutron guide and is then scattered by the sample. After the sample chamber, 2D gas detectors collect the elastic scattering information.
SANS usually uses collimation of the neutron beam to determine the scattering angle of a neutron, which results in a lower signal-to-noise ratio for data that contain information on the properties of a sample. We can analyze the data acquired from the detectors and extract information on size, shape, etc. This is why we chose SANS as our research tool. The world's top energy problems are security, climate, and environmental concerns. So far, oil (37%) is still the No. 1 fuel in world energy consumption (oil 37%, coal 25%, bio-fuels 0.2%, gas 23%, nuclear 6%, biomass 4%, hydro 3%, solar heat 0.5%, wind 0.3%, geothermal 0.2%, and solar photovoltaic 0.04%). Although more and more alternative energy sources (bio-fuels, nuclear, and solar) will be used in the future, nuclear energy faces a major safety issue after the Fukushima I nuclear accident in Japan, and the other sources contribute only small percentages. Thus, it is very important to improve the efficiency and reduce the consumption of petroleum products. There is probably one thing that we can all agree on: the world's energy reserves are not unlimited. Moreover, only 30% of oil reserves are conventional oil, so in order to produce, transport, and refine heavy crude oil without wasting huge amounts of energy, we need to reduce its viscosity without using high-temperature steam heating or diluents. As more and more offshore oil is exploited, we also need to reduce viscosity without increasing temperature. The petroleum consumed in the U.S. in 2009 was 18.7 million barrels per day, 35% of all the energy we consumed. Diesel is a very important fossil fuel, accounting for about 20% of petroleum consumed. Most of the world's oil is non-conventional: 15% heavy oil, 25% extra-heavy oil, and 30% oil sands and bitumen; conventional oil reserves are only 30%.
Oil sands are closely related to heavy crude oil, the main difference being that oil sands generally do not flow at all. For efficient energy production and conservation, how to lower the viscosity of liquid fuels and crude oil is a very important topic. Dr. Tao and his group at Temple University, using his electrorheological and magnetorheological viscosity theory, have developed a new technology that utilizes electric or magnetic fields to change the rheology of complex fluids and reduce their viscosity while keeping the temperature unchanged. After successfully reducing the viscosity of crude oil with an applied field and investigating the microstructure changes in various crude oil samples with SANS, we went on to reduce the viscosity of heavy crude oil, bunker diesel, ultra-low-sulfur diesel, bio-diesel, and crude oil at ultra-low temperature with electric field treatment. Our research group developed the electrorheological viscosity theory and investigated flow rates in laboratory and field pipelines, but had never visualized this aggregation. The small-angle neutron scattering experiment confirmed the theoretical prediction that a strong electric field induces the suspended nano-particles inside crude oil to aggregate into short chains along the field direction. This aggregation breaks the symmetry, making the viscosity anisotropic: along the field direction, the viscosity is significantly reduced. The experiment enabled us to determine the induced chain size and shape, and verified that the electric field works for all kinds of crude oils: paraffin-based, asphalt-based, and mixed-base. The basic physics of such field-induced viscosity reduction is applicable to all kinds of suspensions. / Physics
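The 1-1000 nm range quoted above follows from the standard elastic-scattering momentum transfer q = 4π sin(θ/2)/λ and the real-space scale d ≈ 2π/q. A small sketch of the conversion, using an illustrative cold-neutron wavelength (the numbers are examples, not the instrument settings used in this work):

```python
import math

def q_from_angle(theta_rad, wavelength_nm):
    # elastic scattering momentum transfer: q = 4*pi*sin(theta/2) / lambda
    return 4 * math.pi * math.sin(theta_rad / 2) / wavelength_nm

def probed_length(q_inv_nm):
    # real-space length scale probed at momentum transfer q: d ~ 2*pi/q
    return 2 * math.pi / q_inv_nm
```

With a 0.6 nm wavelength, a scattering angle of 0.01 rad gives q ≈ 0.105 nm⁻¹, i.e. a probed scale of roughly 60 nm, which is why covering 1-1000 nm requires only small angles.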
449

Soudage de polymères semi-cristallins utilisés dans l'isolation de pipeline offshore. Approches thermiques, rhéologiques et mécaniques / Welding of semi-crystalline polymers used in the insulation coating of offshore pipelines. Thermal, rheological and mechanical approaches

Aris-Brosou, Margaux 21 June 2017 (has links)
Cette étude porte sur la caractérisation des matériaux constituant le revêtement isolant de pipeline offshore ainsi que la soudure réalisée entre les deux polymères semi-cristallins du revêtement au niveau de la jonction entre deux tubes successifs. L’épaisseur très importante du revêtement induit, au cours du procédé de soudage, des vitesses hétérogènes de chauffe et de refroidissement des matériaux. Ces dernières ont été caractérisées grâce à une instrumentation du procédé en site industriel. Une modélisation numérique intégrant les phases successives du procédé est en bon accord avec les résultats expérimentaux. Cette modélisation permet de dresser une cartographie complète des champs de température dans l’ensemble du pipeline et plus précisément dans la zone de soudage. Cette étude nous a amené à réaliser une caractérisation des deux matériaux soudés au cours de leurs fusions et cristallisations qui représentent deux étapes cruciales lors du soudage. Une attention particulière a été portée au comportement rhéologique dans la zone de transition entre l’état fondu et l’état solide et inversement. Les données en refroidissement à différentes vitesses ont été corrélées avec le taux de transformation des matériaux. Les propriétés mécaniques des isolants ont été testées ainsi que celles des soudures en prélevant des éprouvettes sur les essais effectués en site industriel. Le peu de flexibilité du procédé industriel rend difficile une investigation de l’influence des paramètres de soudage. Une expérience « image », représentative des grandeurs industrielles, a donc été développée à l’échelle du laboratoire permettant de faire varier les paramètres de soudage. Il a été montré que le point de faiblesse de l’assemblage ne se situe pas au niveau de la soudure mais dans l’un des matériaux du revêtement.
/ This PhD focuses on the characterization of the materials of the insulating coating of offshore pipelines, as well as the welding made between the two semi-crystalline polymers of the coating at the junction of two consecutive pipes. The very large thickness of the coating induces heterogeneous heating and cooling rates during the welding process. Those rates have been characterized through the implementation of thermal sensors during the industrial process. A simulation model of the different steps of the welding process is consistent with the experimental results. This simulation gives access to the thermal fields in the entire pipe and especially in the welding zone. This study allowed us to characterize the two welded materials during their melting and crystallization, which represent the two crucial steps during welding. Particular attention has been paid to their rheological behavior in the transition zone from the molten to the solid state and vice versa. The cooling data at different rates have been correlated with the transformation fraction of the materials. The mechanical properties of the insulating materials have been tested, especially in the welding zone, via the industrial process. However, the imposing infrastructure of the industrial process does not allow study of the influence of the welding parameters. To do so, a “mirror” experiment, representative of the industrial one, was developed at laboratory scale. Both the welding made via the industrial process and the “mirror” experiment have shown that the weak point of the structure is not the welding itself but one of the materials of the coating.
450

Wave-Associated Seabed Behaviour near Submarine Buried Pipelines

Shabani, Behnam January 2008 (has links)
Master of Engineering (Research) / Soil surrounding a submarine buried pipeline consolidates as ocean waves propagate over the seabed surface. Conventional models for the analysis of soil behaviour near the pipeline assume a two-dimensional interaction problem between waves, the seabed soil, and the structure. In other words, it is often considered that water waves travel normal to the orientation of pipeline. However, the real ocean environment is three-dimensional and waves approach the structure from various directions. It is therefore the key objective of the present research to study the seabed behaviour in the vicinity of marine pipelines from a three-dimensional point of view. A three-dimensional numerical model is developed based on the Finite Element Method to analyse the so-called momentary behaviour of soil under the wave loading. In this model, the pipeline is assumed to be rigid and anchored within a rigid impervious trench. A non-slip condition is considered to exist between the pipe and the surrounding soil. Quasi-static soil consolidation equations are then solved with the aid of the proposed FE model. In this analysis, the seabed behaviour is assumed to be linear elastic with the soil strains remaining small. The influence of wave obliquity on seabed responses, i.e. the pore pressure and soil stresses, are then studied. It is revealed that three-dimensional characteristics systematically affect the distribution of soil response around the circumference of the underwater pipeline. Numerical results suggest that the effect of wave obliquity on soil responses can be explained through the following two mechanisms: (i) geometry-based three-dimensional influences, and (ii) the formation of inversion nodes. 
Further, a parametric study is carried out to investigate the influence of soil, wave, and pipeline properties on the wave-associated pore pressure as well as the principal effective and shear stresses within the porous bed, with the aid of the proposed three-dimensional model. There is strong evidence in the literature that the failure of marine pipelines often stems from the instability of the seabed soil close to the structure, rather than from construction deficiencies. Wave-induced seabed instability is associated with either soil shear failure or seabed liquefaction. Therefore, the developed three-dimensional FE model is used in this thesis to further investigate the instability of seabed soil in the presence of a pipeline. The widely accepted criterion, which links soil liquefaction to the wave-induced excess pressure, is used herein to identify seabed liquefaction. It should be pointed out that although the present analysis is concerned only with the momentary liquefaction of seabed soil, this study forms the basis for the three-dimensional analysis of liquefaction due to residual mechanisms; the latter can be an important subject for future investigations. At the same time, a new concept is developed in this thesis that applies the dynamic component of the soil stress angle to address the phenomenon of wave-associated soil shear failure. On this basis, the influence of three-dimensionality on the potential for seabed liquefaction and shear failure around the pipeline is investigated. Numerical simulations reveal that wave obliquity may not notably affect the risk of liquefaction near the underwater pipeline, but it significantly influences the potential for soil shear failure. Finally, the thesis proceeds to a parametric study of the effects of wave, soil, and pipeline characteristics on the excess pore pressure and stress angle in the vicinity of the structure.
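For context on the momentary response modeled numerically above, the classical closed-form result for a fully saturated, isotropic, infinitely deep seabed (without a pipeline) attenuates the linear-wave bottom pressure exponentially with depth: p = p_b·e^(−kz)·cos(kx − ωt), with p_b = γ_w·H / (2·cosh(kh)). The sketch below is this textbook baseline, not the thesis's FE model; the unit weight is an illustrative value.

```python
import math

GAMMA_W = 9810.0  # unit weight of water, N/m^3 (illustrative value)

def bottom_pressure_amplitude(wave_height, water_depth, wavelength):
    # linear wave theory: p_b = gamma_w * H / (2 * cosh(k * h))
    k = 2 * math.pi / wavelength
    return GAMMA_W * wave_height / (2 * math.cosh(k * water_depth))

def pore_pressure(z, x, t, wave_height, water_depth, wavelength, period):
    # closed-form momentary response for a saturated, isotropic, deep seabed:
    # p(z, x, t) = p_b * exp(-k * z) * cos(k * x - omega * t)
    k = 2 * math.pi / wavelength
    omega = 2 * math.pi / period
    p_b = bottom_pressure_amplitude(wave_height, water_depth, wavelength)
    return p_b * math.exp(-k * z) * math.cos(k * x - omega * t)
```

At the mudline (z = 0) the pore pressure equals the wave-induced bottom pressure, and it decays with burial depth, which is why response around a buried pipeline deviates from this baseline and requires the FE treatment.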
