461

Adopting Observability-Driven Development for Cloud-Native Applications : Designing End-to-end Observability Pipeline using Open-source Software / Anta observerbarhetsdriven utveckling för molnbaserade applikationer : En skalbar öppen källkodspipeline och arkitektur

Ni, Chujie January 2023 (has links)
As cloud-native applications become more distributed, complex, and unpredictable with the adoption of microservices and other new architectural components, traditional monitoring solutions are inadequate for providing end-to-end visibility and proactively identifying deviations from expected behaviour before they become disruptive to services. In response to these challenges, observability-driven development (ODD) has been proposed as a new methodology that leverages tools and practices to observe the state and detect the behaviour of systems. Unlike the leading IT giants, which develop their own proprietary tools and platforms, some non-IT companies and smaller organizations still have difficulty adopting observability-driven development: proprietary development demands extensive resources and manpower, while connecting to third-party platforms may compromise data security. This thesis proposes an end-to-end observability pipeline composed entirely of open-source components. The pipeline collects and correlates metrics, logs, and traces to facilitate software development and support troubleshooting in production. It is designed to be adaptive and extensible, so that companies can adopt it as a first step towards observability-driven development and customize it to meet their specific requirements.
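The abstract does not name specific components, so the following is only a minimal sketch of the correlation idea: logs and spans that carry a shared trace identifier can be joined during troubleshooting. The data model, field names, and sample records below are assumptions for illustration, not the pipeline designed in the thesis.

```python
# Illustrative only: correlate log records with trace spans by a shared trace_id,
# the basic mechanism a pipeline relies on when it joins logs and traces.
from collections import defaultdict

spans = [  # e.g. exported by a tracing backend; fields are hypothetical
    {"trace_id": "a1", "span": "checkout", "duration_ms": 412},
    {"trace_id": "b2", "span": "login", "duration_ms": 35},
]
logs = [  # structured application logs carrying the same trace_id
    {"trace_id": "a1", "level": "ERROR", "msg": "payment gateway timeout"},
    {"trace_id": "a1", "level": "INFO", "msg": "retrying payment"},
    {"trace_id": "b2", "level": "INFO", "msg": "user logged in"},
]

logs_by_trace = defaultdict(list)
for record in logs:
    logs_by_trace[record["trace_id"]].append(record)

# Join: for every slow span, pull the log lines emitted within the same trace.
for span in spans:
    if span["duration_ms"] > 300:
        print(f"slow span {span['span']} ({span['duration_ms']} ms):")
        for record in logs_by_trace[span["trace_id"]]:
            print(f"  [{record['level']}] {record['msg']}")
```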
462

Object detection for autonomous trash and litter collection / Objektdetektering för autonom skräpupplockning

Edström, Simon January 2022 (has links)
Trash and litter discarded on the street are a large environmental issue in Sweden and across the globe. In Swedish cities alone it is estimated that 1.8 billion articles of trash are thrown to the street each year, constituting around 3 kilotons of waste. One avenue to combat this societal and environmental problem is to use robotics and AI. A robot could learn to detect trash in the wild and collect it in order to clean the environment. A key component of such a robot would be its computer vision system, which allows it to detect litter and trash. Such systems are not trivially designed or implemented and have only recently reached high enough performance to work in industrial contexts. This master thesis focuses on creating and analysing such an algorithm by gathering data for use in a machine learning model, developing an object detection pipeline, and evaluating the performance of that pipeline as its components are varied. Specifically, hyperparameter optimisation, pseudolabeling, and the preprocessing methods tiling and illumination normalisation were implemented and analysed. This thesis shows that it is possible to create an object detection algorithm with high performance using currently available state-of-the-art methods. Within the analysed context, hyperparameter optimisation did not significantly improve performance, and pseudolabeling could only briefly be analysed but showed promising results. Tiling greatly increased mean average precision (mAP) for the detection of small objects, such as cigarette butts, but decreased the mAP for large objects, and illumination normalisation improved mAP for images that were brightly lit. Both preprocessing methods reduced the frames per second at which a full detector could run, whilst pseudolabeling and hyperparameter optimisation greatly increased training times.
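As a rough sketch of the tiling idea, an image can be cut into overlapping crops so that small objects such as cigarette butts occupy a larger fraction of each detector input. The tile size, overlap, and NumPy representation below are assumptions for illustration, not the thesis implementation.

```python
# Illustrative tiling preprocessing: split an image into overlapping tiles so a
# detector sees small objects at a higher effective resolution.
import numpy as np

def tile_image(image: np.ndarray, tile: int = 640, overlap: int = 64):
    """Yield (x0, y0, crop) for overlapping square tiles covering the image."""
    height, width = image.shape[:2]
    stride = tile - overlap
    for y0 in range(0, max(height - overlap, 1), stride):
        for x0 in range(0, max(width - overlap, 1), stride):
            y1, x1 = min(y0 + tile, height), min(x0 + tile, width)
            yield x0, y0, image[y0:y1, x0:x1]

# Example: a 1080x1920 frame becomes a handful of ~640x640 crops; per-tile
# detections would later be shifted by (x0, y0) and merged, e.g. with NMS.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
tiles = list(tile_image(frame))
print(len(tiles), "tiles, first tile shape:", tiles[0][2].shape)
```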
463

Racial Differences In Juvenile Court Delinquency Outcomes in a Large Urban County in a Midwestern State

Brown, Richard, III 02 April 2014 (has links)
No description available.
464

Modelling and Simulation of Carbon Dioxide Transportation in Pipelines: Effects of Impurities

Peletiri, Suoton P. January 2020 (has links)
Carbon dioxide capture, transportation, and storage (CCTS) has been identified as the most promising way to reduce the anthropogenic carbon dioxide (CO2) released into the atmosphere. Efforts towards this goal include the Paris (Climate) Accord, an agreement that seeks to encourage countries to take the issue of rising global temperatures seriously. With nearly all countries signing this agreement, many CCTS projects are expected. Pipelines are employed in the transportation of CO2. CO2 fluids contain impurities that affect the fluid properties and flow dynamics, yet pipelines are mostly designed assuming that the CO2 fluid is pure. CO2 pipeline fluids contain at least 90% CO2, with the balance made up of impurities including nitrogen, methane, oxygen, hydrogen, sulphur dioxide, hydrogen sulphide, carbon monoxide, ammonia and argon. The effects of the impurities were studied using simulation software: Aspen HYSYS, gPROMS and HydraFlash. The results show that all impurities had a negative impact on transportation. At equal concentrations, hydrogen had the greatest effect on fluid properties and hydrogen sulphide the least. At the specified allowable concentrations, nitrogen had the worst effect on pressure loss (32.1%) in a horizontal pipeline, as well as on density and critical pressure, while carbon monoxide (at only 0.2 mol%) had the smallest effect on pressure drop (0.3%). Analysis of supercritical and subcritical (liquid) CO2 transportation shows that subcritical fluids have higher densities (more volume transported) and lower pressure losses than supercritical fluids; subcritical transportation would therefore incur lower pipeline transportation costs. Also, soil heat conductivity has a greater effect than ambient temperature on buried pipelines. Simple equations that approximate binary CO2 fluid properties from pure CO2 properties were developed and presented.
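For orientation only, single-phase frictional pressure loss in a horizontal pipe is commonly estimated with the Darcy-Weisbach relation. The sketch below uses assumed, illustrative values (pipe length, diameter, mass flow, friction factor, densities) and is not the property model or simulation setup used in the thesis; it merely shows how a lower mixture density translates into a larger pressure drop.

```python
# Hedged illustration: Darcy-Weisbach pressure drop for a dense-phase CO2 stream.
# All numbers are assumptions; impurities enter mainly through their effect on
# density (and viscosity, hence the friction factor).
import math

def pressure_drop_pa(length_m, diameter_m, mass_flow_kg_s, density_kg_m3, friction_factor):
    """Frictional loss dp = f * (L/D) * rho * v^2 / 2 for a horizontal pipe."""
    area = math.pi * diameter_m ** 2 / 4.0
    velocity = mass_flow_kg_s / (density_kg_m3 * area)
    return friction_factor * (length_m / diameter_m) * density_kg_m3 * velocity ** 2 / 2.0

# Example: pure CO2 vs. a lower-density CO2/N2 mixture over 100 km of 0.5 m pipe.
for label, rho in [("pure CO2", 800.0), ("CO2 + N2 (lower density)", 700.0)]:
    dp = pressure_drop_pa(100_000, 0.5, 200.0, rho, 0.012)
    print(f"{label}: {dp / 1e5:.2f} bar")
```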
465

Unpacking the Discipline Gap: Referral Categories and School-Wide Positive Behavior Interventions and Supports

Barclay, Christopher Michael 14 October 2015 (has links)
Despite decades of efforts to racially integrate schools and the recent accountability movement, U.S. students’ access to equitable education remains elusive. Research demonstrates that discipline procedures disproportionately remove racial minority students from the classroom, creating a “discipline gap.” Racial disparities in discrete disciplinary infraction types (e.g., disruption, aggression) have shown nuanced patterns across groups and school levels. Moreover, the relationship between school-wide positive behavior interventions and supports (SWPBIS), a framework for promoting positive behavior and preventing conflict, and the discipline gap is unclear. This investigation explored racial/ethnic disparities per infraction type (e.g., disruption, verbal abuse) and the relationship of SWPBIS implementation fidelity to these referrals using multilevel logistic regression analyses. Participants were 40 elementary schools receiving PBIS technical assistance and the 24,512 students served by those schools. Findings of disciplinary disparities were largely consistent with previous studies using similar methods. Compared to White peers, Black students were overrepresented in office discipline referrals (ODRs) across all infraction types, while Hispanic students were underrepresented in Aggression referrals and other racial/ethnic minority students were underrepresented in Miscellaneous referrals. SWPBIS implementation fidelity demonstrated a significant negative relationship with the overall ODR rate and was significantly related to infractions for Aggression; however, no evidence was produced to support the notion that SWPBIS produces more equitable discipline practices. Implications for the research and practice of culturally responsive behavior supports are discussed.
466

Predictions of explosions and fires of natural gas/hydrogen mixtures for hazard assessment

Mumby, Christopher January 2010 (has links)
The work presented in this thesis was undertaken as part of the safety work package of the NATURALHY project, an integrated project funded by the European Commission (EC) within the Sixth Framework Programme. The purpose of the NATURALHY project was to investigate the feasibility of using existing natural gas infrastructure to assist a transition to a hydrogen-based economy by transporting hydrogen from its place of production to its place of use as a mixture of natural gas and hydrogen. The hydrogen can then be extracted from the mixture for use in fuel cells, or the mixture can be used directly in conventional combustion devices. The research presented in this thesis focused on predicting the consequences of explosions and fires involving natural gas/hydrogen mixtures, using engineering-type mathematical models typical of those used by the gas industry for risk assessment purposes. The first part of the thesis concentrated on modifying existing models that had been developed to predict confined vented and unconfined vapour cloud explosions involving natural gas. Three geometries were studied: a confined vented enclosure, an unconfined cubical region of congestion, and an unconfined high-aspect-ratio region of congestion. The modifications made to the models were aimed at accounting for the different characteristics of a natural gas/hydrogen mixture compared to natural gas. Experimental data for the laminar burning velocity of methane/hydrogen mixtures were obtained within the safety work package. For practical reasons, this experimental work was carried out at an elevated temperature, and predictions from kinetic modelling were employed to convert this information for use in models predicting explosions at ambient temperature. For confined vented explosions, a model developed by Shell (SCOPE) was used and modified by adding new laminar burning velocity and Markstein number data relevant to the gas compositions studied. For vapour cloud explosions in a cubical region of congestion, two models were used. The first, developed by Shell (CAM2), was applied using the new laminar burning velocity and other composition-specific properties. The second was based on a model provided by GL Services and was modified by generalising the flame speed model so that any natural gas/hydrogen mixture could be simulated. For vapour cloud explosions in an unconfined high-aspect-ratio region of congestion, a model from GL Services was used, with modifications to the flame speed modelling so that it could be applied to different fuel compositions, equivalence ratios, and initial flame speeds entering the congested region. Predictions from the modified explosion models were compared with large-scale experimental data obtained within the safety work package. Generally (apart from cases where continuously accelerating flames were produced), satisfactory agreement was achieved, demonstrating that the modified models could be used, in many cases, for risk assessment of explosions involving natural gas/hydrogen mixtures. The second part of the thesis concentrated on predicting the incident thermal radiation from high-pressure jet fires and pipeline fires involving natural gas/hydrogen mixtures. The approach taken was to modify existing models developed for natural gas. For jet fires, three models were used; fuel-specific input parameters were derived and the predictions of flame length and incident radiation compared with large-scale experimental data.
For pipeline fires a model was developed using a multi-point source approach for the radiation emitted by the fire and a correlation for flame length. Again predictions were compared with large scale experimental data. For both types of fire, satisfactory predictions of the flame length and incident radiation were obtained for natural gas and mixtures of natural gas and hydrogen containing approximately 25% hydrogen.
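As a hedged sketch of the multi-point source idea only: the flame is represented by N point sources spaced along its length, each emitting an equal share of the radiated power, and the incident flux at a receiver is the sum of their inverse-square contributions. The source count, radiative fraction, flame geometry, and transmissivity below are assumed illustrative values, not the parameters of the models used in the thesis.

```python
# Illustrative multi-point source model: incident radiation at a receiver is the
# sum of inverse-square contributions from point sources along the flame axis.
import math

def incident_flux_kw_m2(q_release_kw, radiative_fraction, flame_length_m,
                        receiver_xy, n_sources=10, transmissivity=1.0):
    """Sum point-source contributions q_i / (4*pi*r_i^2) at a ground-level receiver."""
    q_per_source = radiative_fraction * q_release_kw / n_sources
    rx, ry = receiver_xy
    flux = 0.0
    for i in range(n_sources):
        # Sources spaced evenly along an assumed vertical flame above the release point.
        z = (i + 0.5) * flame_length_m / n_sources
        r2 = rx ** 2 + ry ** 2 + z ** 2
        flux += transmissivity * q_per_source / (4.0 * math.pi * r2)
    return flux

# Example: a 100 MW fire radiating 25% of its power from a 30 m flame,
# observed by a receiver 50 m away (all values assumed).
print(f"{incident_flux_kw_m2(100_000, 0.25, 30.0, (50.0, 0.0)):.2f} kW/m^2")
```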
467

Vamzdyno nuogulų tyrimas ir apibūdinimas / The research and characterization of pipeline scales

Verikienė, Vaida 22 June 2010 (has links)
In the Master’s thesis “The research and characterization of pipeline scales”, a review of the scientific literature describes the scales that form on inner pipe walls, which consist predominantly of iron compounds, their influence on water quality, and the problems they cause in water distribution systems. An experimental investigation was carried out on laboratory-scale equipment at the Water Management Department of VGTU. Two experimental rigs were built from pipes of different materials (galvanized steel, old steel and polyethylene (PE)) through which water of different quality circulated, and the quantitative and qualitative composition of the scales was measured. The quantitative composition was characterized by the change in COD at different water residence times in the pipelines; the qualitative composition was characterized by the concentrations of iron, manganese compounds, ammonium, nitrite and nitrate in the water. The particle size distribution of the pipeline scales was also determined. The results showed that the total iron concentration was lowest in the PE pipe samples and highest in the old steel pipe samples. In the PE pipe, the manganese concentration increased with increasing water residence time. Ammonium, nitrite and nitrate levels did not exceed the hygiene-norm requirements. The lowest COD values were found in the PE pipe scales on both rigs, and the COD values were shown to depend on the quality of the water circulating in the pipeline. It was... [to full text]
468

Des réseaux de processus cyclo-statiques à la génération de code pour le pipeline multi-dimensionnel / From Cyclo-Static Process Networks to Code Generation for Multidimensional Software Pipelining

Fellahi, Mohammed 22 April 2011 (has links)
Applications based on streams (ordered sequences of data values) are important targets of program optimization because of their high computational requirements and the diversity of their application domains: communication, embedded systems, multimedia, etc. One of the most important and difficult problems in the design and implementation of special-purpose stream languages is how to schedule these applications at a fine grain so as to exploit the available machine resources. In this thesis we propose a framework for fine-grain scheduling of streaming applications, and of nested loops in general. First, we try to pipeline the steady-state phases (inner loops) by finding the repeated kernel pattern and executing actor occurrences in parallel as much as possible. Then we merge the kernel prologs and epilogs of the pipelined phases to move them out of the outer loop. Merging the kernel prolog and epilog means that we shift actor occurrences, or instructions, from one phase iteration to another and from one outer-loop iteration to another: a multidimensional shifting. Experiments show that our framework can improve performance and extract parallelism without increasing code size, both in streaming applications and in nested loops in general.
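The prolog/kernel/epilog terminology can be illustrated with a toy, one-dimensional software-pipelining schedule. The stage names and the initiation interval of one cycle are assumptions chosen for clarity; this sketch does not reproduce the thesis's multidimensional prolog/epilog merging, only the vocabulary it builds on.

```python
# Toy software-pipelining schedule: stage s of iteration i runs at cycle i + s,
# so early cycles form the prolog, the steady state forms the kernel, and the
# trailing cycles form the epilog that the thesis seeks to merge away.
def pipeline_schedule(n_iters, stages):
    depth = len(stages)
    for cycle in range(n_iters + depth - 1):
        slots = []
        for i in range(n_iters):
            s = cycle - i
            if 0 <= s < depth:
                slots.append(f"{stages[s]}(it{i})")
        region = "prolog" if cycle < depth - 1 else ("epilog" if cycle >= n_iters else "kernel")
        print(f"cycle {cycle:2d} [{region:6s}]: " + "  ".join(slots))

pipeline_schedule(n_iters=5, stages=["load", "compute", "store"])
```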
469

Computational Pipeline for Human Transcriptome Quantification Using RNA-seq Data

Xu, Guorong 04 August 2011 (has links)
The main theme of this thesis research is the development of a computational pipeline for processing next-generation RNA sequencing (RNA-seq) data. RNA-seq experiments generate tens of millions of short reads for each DNA/RNA sample, and the alignment of this large volume of short reads to a reference genome is a key step in NGS data analysis. Although storing alignment information in the Sequence Alignment/Map (SAM) or Binary SAM (BAM) format is now standard, biomedical researchers still have difficulty accessing useful information. To help biomedical researchers conveniently access essential information from NGS data files in SAM/BAM format, we have developed a graphical user interface (GUI) software tool named SAMMate that pipelines human transcriptome quantification. SAMMate allows researchers to easily process NGS data files in SAM/BAM format, is compatible with both single-end and paired-end sequencing technologies, and allows researchers to accurately calculate gene expression abundance scores.
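As an illustrative sketch only: turning alignments into an abundance score amounts to counting reads over a gene's coordinates and normalizing by gene length and library size. The use of pysam, the RPKM formula, and the file and gene coordinates below are assumptions for illustration; SAMMate's own counting and scoring are those described in the thesis.

```python
# Hedged sketch: count reads overlapping a gene in an indexed BAM file and turn
# the count into an RPKM-style abundance score. The BAM path and gene
# coordinates are hypothetical; a coordinate-sorted, indexed BAM is required.
import pysam

def gene_rpkm(bam_path, contig, start, end):
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        gene_reads = bam.count(contig, start, end)  # reads overlapping the gene
        total_mapped = bam.mapped                   # total mapped reads (from the index)
    gene_length = end - start
    # Reads Per Kilobase of transcript per Million mapped reads.
    return gene_reads * 1e9 / (gene_length * total_mapped)

print(gene_rpkm("sample.bam", "chr1", 1_000_000, 1_010_000))
```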
470

[en] DYNAMIC BEHAVIOR OF BURIED PIPES: METHODOLOGY AND COMPUTATIONAL IMPLEMENTATION / [pt] COMPORTAMENTO DINÂMICO DE DUTOS ENTERRADOS: METODOLOGIA E IMPLEMENTAÇÃO COMPUTACIONAL

IGOR EDUARDO OTINIANO MEJIA 31 March 2009 (has links)
This work presents a numerical methodology for the analysis of buried pipes used to transport oil and gas and subjected to dynamic loads, with emphasis on seismic loads. A finite element model based on a special class of beam element is employed to represent the pipe. Both geometric and material nonlinearities are considered in a total Lagrangean formulation. The equilibrium equations are formulated from the virtual work principle in terms of the stress and deformation components of the beam-pipe element. The Reduced Modulus Direct Integration (RMDI) technique is employed, through which the elasto-plastic material behaviour is incorporated; this technique excludes from the analysis the local buckling effects of the pipe walls. The corresponding finite element matrices for this element are obtained. The methodology includes the effects of constant internal pressure as well as soil-pipe interaction: the soil is modelled with two-dimensional elements whose material behaviour is described by a linear-equivalent model, and interface elements couple the beam-pipe elements to the soil elements. Silent (absorbing) boundary elements are incorporated into the model to reproduce the semi-infinite boundary conditions in a finite-size model, and distributed loads are considered constant with respect to the global axes. Acceleration histories are applied to simulate seismic loads, among them the record of the earthquake that struck Pisco, Perú, in 2007. A finite element computer code was developed according to the presented methodology, and several examples are studied to evaluate the numerical results and to draw conclusions about the behaviour of buried pipes subjected to seismic loads.
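For illustration only, the response of a drastically simplified single-degree-of-freedom idealization of a pipe-soil system to a ground acceleration history can be integrated with the Newmark average-acceleration method. The mass, stiffness, damping, and synthetic sinusoidal record below are assumptions and do not represent the thesis's nonlinear beam-pipe finite element model or the RMDI technique.

```python
# Hedged sketch: Newmark (average acceleration) integration of
# m*u'' + c*u' + k*u = -m*ag(t) for a single-degree-of-freedom idealization.
import math

def newmark_sdof(mass, damping, stiffness, ag, dt, gamma=0.5, beta=0.25):
    """Return the displacement history for ground acceleration samples ag (m/s^2)."""
    u, v = 0.0, 0.0
    a = -ag[0]  # initial acceleration for a system starting at rest
    k_eff = stiffness + gamma / (beta * dt) * damping + mass / (beta * dt * dt)
    history = [u]
    for i in range(len(ag) - 1):
        dp = -mass * (ag[i + 1] - ag[i])
        dp_eff = (dp
                  + (mass / (beta * dt) + gamma / beta * damping) * v
                  + (mass / (2 * beta) + dt * (gamma / (2 * beta) - 1) * damping) * a)
        du = dp_eff / k_eff
        dv = gamma / (beta * dt) * du - gamma / beta * v + dt * (1 - gamma / (2 * beta)) * a
        da = du / (beta * dt * dt) - v / (beta * dt) - a / (2 * beta)
        u, v, a = u + du, v + dv, a + da
        history.append(u)
    return history

# Example: 10 s of a 1 Hz, 0.2 g sinusoidal ground motion sampled at dt = 0.01 s.
dt = 0.01
record = [0.2 * 9.81 * math.sin(2 * math.pi * 1.0 * i * dt) for i in range(1000)]
disp = newmark_sdof(mass=1000.0, damping=500.0, stiffness=4.0e5, ag=record, dt=dt)
print(f"peak displacement: {max(abs(x) for x in disp) * 1000:.1f} mm")
```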
