  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
361

Skreening schopnosti hlubinné mikroflóry rozkládat ropné látky / Screening of possibilities of deep subsurface microflora to decompose selected organic compounds

Kuanysheva, Assel January 2013 (has links)
Screening of possibilities of deep subsurface microflora to decompose selected organic compounds. Abstract: The aim of this study was to test bacterial strains of the deep subsurface microflora for their ability to grow in an oily environment; aliphatic hydrocarbons were tested alongside toluene as an example of an aromatic hydrocarbon. Selected strains were cultivated to measure their growth and microbial activity under conditions simulating soil, in order to assess the usability of these strains in the practical remediation of oil contamination. This thesis thus evaluates the possible use of selected strains of the deep microflora for oil decomposition. Some groups of microorganisms living in the Tertiary claystones at depths of 30-450 m below the surface are evidently able to biodegrade fossil organic matter of the kerogen type. Chemical findings indicate that this organic matter consists of aliphatic chains of various lengths, which suggests that microorganisms decomposing kerogen might also be able to break down oil and petroleum products. The findings of our experiment indicate that benzene and toluene, like kerogen, are highly resistant organic compounds, and evidence of their microbial degradation is rare. Utilization of oil as a representative aliphatic compound is better...
362

Measurement of high Q² charged-current deep inelastic scattering with polarised positron beams using the ZEUS detector at HERA

Oliver, Katie Rosemarie January 2011 (has links)
This thesis presents measurements of charged current deep inelastic scattering cross sections in e+p collisions with longitudinally polarised positron beams. The measurements are based on data taken by the ZEUS detector at the HERA collider during the 2006-2007 running period. The data sample has an integrated luminosity of 132 pb-1 and was taken at a centre-of-mass energy of 318 GeV. The total cross section has been measured at positive and negative values of the longitudinal polarisation of the positron beam (Pe). In addition, the single differential cross sections dσ/dQ2, dσ/dx and dσ/dy have been measured for Q2 > 200 GeV2, also using both positively and negatively polarised positron beams. The reduced cross section has been measured in nine bins of Q2 in the kinematic range 280 < Q2 < 30000 GeV2 and 0.0078 < x < 0.42. The results are compared against the descriptions provided by the CTEQ6.6, MSTW 2008, HERAPDF1.0 and ZEUS-JETS PDFs. In general, the measured cross sections are well described by these predictions. Based on the measurement of the total cross section as a function of the polarisation of the positron beam, a lower limit on the mass of a hypothetical right-handed W boson has been extracted from the upper limit of the cross section at Pe = -1. This limit is complementary to the limits obtained from direct searches (for example at CDF and D0) because the limit presented herein is for a space-like W, whereas direct searches constrain the mass of a time-like W boson. The results of this analysis have been published and have been included in the determination of the HERAPDF theoretical prediction and also in the H1 and ZEUS combined results.
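The polarisation dependence that underpins the right-handed W limit can be stated compactly. In the Standard Model, only right-handed positrons participate in charged current interactions, so the e+p CC cross section scales linearly with the longitudinal polarisation Pe and vanishes for a fully left-handed beam (Pe = -1); a non-zero cross section there would signal a right-handed current. A sketch of this standard relation (stated here for context, not quoted from the thesis):

```latex
\sigma^{CC}_{e^{+}p}(P_e) \;=\; (1 + P_e)\,\sigma^{CC}_{e^{+}p}(P_e = 0)
```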
363

A Geological Interpretation of 3D Seismic Data of a Salt Structure and Subsalt Horizons in the Mississippi Canyon Subdivision of the Gulf of Mexico

Mejias, Mariela 22 May 2006 (has links)
The Gulf of Mexico (GOM) represents a challenge for exploration and production. Most of the sediment coming from North America has bypassed the shelf margin into deep water. To attack this challenge, this thesis aims to look beneath the GOM's false bottom, which is mainly composed of diverse salt structures and growth fault families. Geological and geophysical data are integrated to find clues to a potential hydrocarbon indicator (PHI) that could be of reservoir quality (RQ). 3D pre-stack depth migrated data covering Mississippi Canyon blocks were interpreted: the top and base of salt were mapped, leading to the identification of a PHI represented by a consistent amplitude anomaly (AA) below and adjacent to a salt structure. This AA may be of RQ, and a feasibility evaluation for further decisions may be undertaken. Following the structural sequences that governed the central GOM from the Oligocene through the Miocene was important in supporting the results.
364

Packaging Demand Forecasting in Logistics using Deep Neural Networks

Bachu, Yashwanth January 2019 (has links)
Background: Logistics play a vital role in supply chain management, and logistics operations depend on the availability of packaging material for packing goods to be shipped. Forecasting packaging material demand over a long horizon helps the organization plan to meet that demand. This research proposes using time-series data with deep neural networks (DNNs) for long-term forecasting. Objectives: This study identifies the DNNs used in forecasting packaging demand and in similar problems, i.e. problems with data comparable to that available at the organization (Volvo). It then identifies the best-practice approach for long-term forecasting and combines that approach with the identified and selected DNNs. The end objective of the thesis is to suggest the best DNN model for packaging demand forecasting. Methods: An experiment is conducted to evaluate the DNN models selected for demand forecasting. Three models were selected through a preliminary systematic literature review. Another systematic literature review was performed in parallel to identify metrics for measuring model performance. Results from the preliminary literature review were instrumental in designing the experiment. Results: All three models studied perform well, producing usable forecasts; given the type and amount of historical data the models were trained on, the differences between their performance measures are very slight. Comparisons are made using the measures selected in the literature review. To better understand the impact of batch size on model performance, the three models were each trained with two different batch sizes. Conclusions: The proposed models produce usable forecasts of packaging demand for planning the next 52 weeks (~1 year).
Results show that by adopting DNNs for forecasting, reliable packaging demand can be forecasted from time-series data on packaging material. The CNN-LSTM combination outperforms the respective individual models by a small margin. Extending the forecasting to a finer granularity of the supply chain (individual suppliers and plants) would benefit the organization by controlling inventory and avoiding excess stock.
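As background to this kind of experiment, long-term forecasting with neural networks is usually framed by sliding a fixed-length input window over the series and predicting the following horizon, then scoring forecasts with a metric such as MAPE against a naive baseline. A minimal sketch of that framing in plain Python (the window and horizon sizes and the seasonal-naive baseline are illustrative assumptions, not details from the thesis):

```python
def make_windows(series, n_in, n_out):
    """Turn a 1-D series into (input window, target horizon) training pairs."""
    pairs = []
    for i in range(len(series) - n_in - n_out + 1):
        pairs.append((series[i:i + n_in], series[i + n_in:i + n_in + n_out]))
    return pairs

def seasonal_naive(history, horizon, season=52):
    """Baseline forecast: repeat the value observed one season (e.g. 52 weeks) ago."""
    return [history[-season + (h % season)] for h in range(horizon)]

def mape(actual, forecast):
    """Mean absolute percentage error, a common forecasting metric."""
    return 100.0 * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / len(actual)

# 4 years of synthetic weekly demand with a spike at the start of each year.
demand = [100 + (10 if w % 52 == 0 else 0) for w in range(208)]
pairs = make_windows(demand, n_in=52, n_out=4)     # one year in, four weeks out
history, target = demand[:156], demand[156:160]
print(mape(target, seasonal_naive(history, 4)))    # 0.0 for a perfectly seasonal series
```

Any model (CNN, LSTM, or a CNN-LSTM hybrid) would then be trained on `pairs` and judged by whether it beats such a baseline.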
365

Watermarking in Audio using Deep Learning

Tegendal, Lukas January 2019 (has links)
Watermarking is a technique used to mark ownership of media such as audio or images by embedding a watermark, e.g. copyright information, into the media. A good watermarking method performs this embedding without affecting the quality of the media. Recent methods for watermarking in images use deep learning to embed and extract the watermark. In this thesis, we investigate watermarking in the audible frequencies of audio using deep learning. More specifically, we try to create a watermarking method for audio that is robust to noise in the carrier, and that allows the embedded watermark to be extracted after the audio has been played over the air. The proposed method consists of two deep convolutional neural networks trained end-to-end on music with simulated noise. Experiments show that the proposed method successfully creates watermarks that are robust to simulated noise with moderate quality reductions, but it is not robust to the real-world noise introduced by playing and recording the audio over the air.
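To make the embed/extract setup concrete, a classical (non-deep-learning) baseline is spread-spectrum watermarking: add a faint pseudo-random carrier whose sign encodes each bit, and recover bits by correlating against the same carrier. This is a standard textbook technique sketched here for illustration; it is not the thesis's method, and the chip length, strength, and seed are arbitrary assumptions:

```python
import math, random

def embed(signal, bits, strength=0.1, seed=7):
    """Spread-spectrum embedding: add a bit-signed pseudo-random carrier per chip."""
    rng = random.Random(seed)
    chip = len(signal) // len(bits)
    carrier = [rng.choice((-1.0, 1.0)) for _ in range(len(signal))]
    out = list(signal)
    for i, b in enumerate(bits):
        sign = 1.0 if b else -1.0
        for j in range(i * chip, (i + 1) * chip):
            out[j] += strength * sign * carrier[j]
    return out

def extract(signal, n_bits, seed=7):
    """Correlate each chip with the same pseudo-random carrier to recover the bits."""
    rng = random.Random(seed)
    chip = len(signal) // n_bits
    carrier = [rng.choice((-1.0, 1.0)) for _ in range(len(signal))]
    bits = []
    for i in range(n_bits):
        corr = sum(signal[j] * carrier[j] for j in range(i * chip, (i + 1) * chip))
        bits.append(1 if corr > 0 else 0)
    return bits

audio = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]  # 1 s of a 440 Hz tone
marked = embed(audio, [1, 0, 1, 1])
recovered = extract(marked, 4)  # correlation detection recovers the embedded bits
```

The deep-learning approach in the thesis replaces these hand-designed embed/extract rules with learned networks, but the trade-off it measures is the same: embedding strength versus audio quality versus robustness to channel noise.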
366

Attributed Multi-Relational Attention Network for Fact-checking URL Recommendation

You, Di 11 July 2019 (has links)
To combat fake news, researchers have mostly focused on detecting fake news, while journalists have built and maintained fact-checking sites (e.g., Snopes.com and Politifact.com). However, fake news dissemination has been greatly promoted by social media sites, and these fact-checking sites have not been fully utilized. To overcome these problems and complement existing methods against fake news, in this thesis we propose a deep-learning based fact-checking URL recommender system to mitigate the impact of fake news on social media sites such as Twitter and Facebook. In particular, our proposed framework consists of a multi-relational attentive module and a heterogeneous graph attention network that learn the complex semantic relationships between user-URL, user-user, and URL-URL pairs. Extensive experiments on a real-world dataset show that our proposed framework outperforms seven state-of-the-art recommendation models, achieving an improvement of at least 3-5.3%.
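The attention mechanisms the abstract refers to share a common core: each query (here, a user or URL node) scores the candidates it attends to, normalizes the scores with a softmax, and takes the weighted sum of their feature vectors. A minimal framework-free sketch of that building block (illustrative only, not the thesis's exact architecture):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]  # subtract the max for numerical stability
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention: weight each value by how well its key matches the query."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

# A query matching the first key most strongly pulls the output toward the first value.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0, 0.0], [0.0, 10.0]])
```

A multi-relational variant runs one such attention per relation type (user-URL, user-user, URL-URL) and combines the resulting vectors.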
367

Ressuage des matériaux cimentaires : origine physique et changement d'échelle / Bleeding of cementitious materials

Massoussi, Nadia 10 October 2017 (has links)
Au vu de la différence de densité entre les composants minéraux solides et l'eau entrant dans la composition d'un béton, une instabilité gravitaire peut apparaître et provoquer une séparation de phase. Cette séparation est à l'origine de la formation d'une pellicule d'eau à la surface du béton et est appelée ressuage. Malgré le fait que le ressuage peut directement ou indirectement nuire aux propriétés finales du béton durci, les connaissances existantes ne permettent pas de prédire ce phénomène ou de le corréler à la formulation du béton. L'objectif de cette thèse est d'identifier la physique mise en jeu lors du phénomène de ressuage de façon à proposer à la fois une méthodologie de mesure adaptée et un cadre théorique prédictif. La démarche retenue consiste à commencer par l'étude d'un matériau simple tel qu'une pâte de ciment en laboratoire pour terminer à l'échelle plus complexe d'un béton de fondation coulé sur chantier. Dans une première partie, nos résultats expérimentaux sur pâte de ciment suggèrent que le ressuage ne peut pas être considéré comme un simple phénomène de consolidation homogène d'un matériau poreux déformable mais comme un phénomène de consolidation hétérogène conduisant à la formation de canaux préférentiels d'extraction d'eau. Nous montrons ainsi l'existence de trois régimes de ressuage : une période d'induction, une période d'accélération et une période de consolidation. Seuls les deux derniers régimes avaient été observés et discutés jusqu'à maintenant dans la littérature. Nos résultats suggèrent que la formation de ces canaux préférentiels semble être initiée par les défauts du système (les bulles d'air au premier ordre). Dans une seconde partie, les deux essais normalisés utilisés à ce jour dans la pratique industrielle pour la mesure du ressuage des bétons sur chantier, l'essai ASTM et l'essai Bauer, sont étudiés. Nous montrons que ces essais capturent des aspects différents du ressuage et qu'ils ne peuvent donc être corrélés. Nous montrons par ailleurs l'existence de limites dans la capacité de ces essais à capturer le risque de ressuage pour un béton donné. Des modifications de protocole sont alors proposées pour améliorer ces essais et leur permettre de fournir les données nécessaires à la prédiction du ressuage à l'échelle de la fondation. Enfin, nous étudions à la fois les différences entre ressuage d'une pâte de ciment et ressuage d'un béton et l'influence de la hauteur totale de matériau soumis au ressuage. La forte dépendance de la vitesse de ressuage à la profondeur est mise en évidence dans le cas des bétons. Un modèle permettant d'extrapoler une vitesse de ressuage dans une fondation à partir d'une mesure de ressuage à l'aide de l'essai ASTM est proposé. Ce modèle est validé sur des essais de laboratoire et des fondations réelles. Mots clés : ressuage, béton, pâte de ciment, consolidation, effet d'échelle / Due to the density difference between the solid mineral components and the suspending water, gravity can induce phase separation in concrete. This phase separation is at the origin of the formation of a film of water on the upper surface of fresh concrete, commonly known as bleeding. Although bleeding is known to directly or indirectly affect the final properties of hardened concrete, the existing knowledge does not allow for the prediction of this phenomenon or its correlation to mix proportions. The objective of this thesis, therefore, is to identify the physics behind the bleeding phenomenon in order to propose both an adapted measurement methodology and a predictive theoretical framework. The approach adopted is to start from the study of a simple model material, a cement paste in the laboratory, and upscale to the more complex scale of concrete poured into a real foundation on site. In the first part, our experimental results on cement paste suggest that bleeding cannot be simply described as the consolidation of a soft porous material, but, in fact, is of an obviously heterogeneous nature leading to the formation of preferential water extraction channels within the cement paste. We thus show the existence of three bleeding regimes: an induction period, an acceleration period, and a consolidation period. Only the last two regimes had been observed and discussed in the literature. Our results suggest that the formation of these preferential channels seems to be initiated by system defects (air bubbles at first order). In the second part, the two industrial standard tests used for the measurement of bleeding on site, the ASTM test and the Bauer test, are studied. We show that these tests capture different aspects of bleeding, and therefore, cannot be correlated. We also show the existence of limits in the capacity of these tests to capture the risk of bleeding for a given concrete. Changes and improvements are proposed in order to enable these tests to provide the data necessary for the prediction of bleeding at the concrete foundation scale. Finally, in the last part, we study the differences between the bleeding of a cement paste and the bleeding of a concrete, and the influence of the total height of material subjected to bleeding. The high dependence of the bleeding rate on the depth of the foundation is captured in the case of concretes. A model is proposed to extrapolate the bleeding rate in a foundation from a bleeding measurement using the ASTM test. This model is validated on laboratory tests and on onsite measurements of real concrete foundations. Keywords: bleeding, concrete, cement paste, consolidation, scale effect
368

Taxonomy and ecology of the deep-pelagic fish family Melamphaidae, with emphasis on interactions with a mid-ocean ridge system

Unknown Date (has links)
Much of the world's oceans lie below a depth of 200 meters, but very little is known about the creatures that inhabit these deep-sea environments. The deep-sea fish family Melamphaidae (Stephanoberyciformes) is one such example of an understudied group of fishes. Samples from the MAR-ECO (www.mar-eco.no) project represent one of the largest melamphaid collections, providing an ideal opportunity to gain information on this important, but understudied, family of fishes. The key to the family presented here is the first updated, comprehensive key since those produced by Ebeling and Weed (1963) and Keene (1987). Samples from the 2004 MAR-ECO cruise and the U.S. National Museum of Natural History provided an opportunity to review two possible new species, the Scopelogadus mizolepis subspecies, and a Poromitra crassiceps species complex. Results show that Scopeloberyx americanus and Melamphaes indicoides are new species, while the two subspecies of Scopelogadus mizolepis are most likely only one species and the Poromitra crassiceps complex is actually several different species of Poromitra. Data collected from the MAR-ECO cruise provided an opportunity to study the distribution, reproductive characteristics and trophic ecology of the family Melamphaidae along the Mid-Atlantic Ridge (MAR). Cluster analysis showed that there are five distinct groups of melamphaid fishes along the MAR. This analysis also supported the initial observation that the melamphaid assemblage changes between the northern and southern edges of an anti-cyclonic anomaly that could be indicative of a warm-core ring. Analysis of the reproductive characteristics of the melamphaid assemblage revealed that many of the female fishes have a high gonadosomatic index (GSI) consistent with values found for other species of deep-sea fishes during their spawning seasons. / This may indicate that melamphaids use this ridge as a spawning ground. 
Diets of the melamphaid fishes were composed primarily of ostracods, amphipods, copepods and euphausiids. Scopelogadus was the only genus shown to have a high percentage of gelatinous prey in its digestive system, while Melamphaes had the highest concentration of chaetognaths. This work presents data on the ecology and taxonomy of the family Melamphaidae and provides a strong base for any future work on this biomass-dominant family of fishes. / by Kyle Allen Bartow. / Thesis (Ph.D.)--Florida Atlantic University, 2010. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2010. Mode of access: World Wide Web.
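The gonadosomatic index mentioned above is a simple ratio used to gauge reproductive readiness: gonad mass expressed as a percentage of body mass. A small illustrative helper (the sample masses below are made up for the example, not values from the study):

```python
def gsi(gonad_mass_g, body_mass_g):
    """Gonadosomatic index: gonad mass as a percentage of total body mass."""
    return 100.0 * gonad_mass_g / body_mass_g

# A fish with 1.2 g gonads and a 15 g total body mass:
print(round(gsi(1.2, 15.0), 1))  # → 8.0
```

Comparing such values against the GSI reported for other deep-sea species in spawning condition is what supports the ridge-as-spawning-ground interpretation.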
369

3D Visualization of MPC-based Algorithms for Autonomous Vehicles

Sörliden, Pär January 2019 (has links)
The area of autonomous vehicles is an interesting research topic that is popular in both research and industry worldwide. Linköping University is no exception, and some of its research is based on using Model Predictive Control (MPC) for autonomous vehicles: MPC is used to plan a path and control the vehicles, while different methods (for example deep learning or likelihood-based estimation) are used to calculate collision probabilities for obstacles. These are very complex algorithms, and it is not always easy to see how they work. It is therefore interesting to study whether a visualization tool, in which the algorithms are presented in three dimensions, can be useful both for understanding the algorithms and for developing them.  This project has consisted of implementing and evaluating such a visualization tool. The tool was implemented using a 3D library and then evaluated both analytically and empirically. The evaluation showed positive results: the proposed tool is helpful when developing algorithms for autonomous vehicles, but some aspects of the algorithms would still need more research into how they could be visualized. This concerns the neural networks, which were shown to be difficult to visualize, especially given the available data. It was found that more information about the internal variables of the network would be needed to visualize it better.
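For readers unfamiliar with MPC, the core loop being visualized is small: at each step, search over candidate control sequences for a short horizon, score each by a cost over the predicted trajectory, apply only the first control, and repeat. A toy 1-D sketch (the double-integrator model, horizon, and cost weights are illustrative assumptions, not the algorithms from the thesis):

```python
from itertools import product

def mpc_step(pos, vel, target, horizon=4, controls=(-1.0, 0.0, 1.0), dt=0.5):
    """Return the first action of the cheapest control sequence over the horizon."""
    best_cost, best_first = float("inf"), 0.0
    for seq in product(controls, repeat=horizon):
        p, v, cost = pos, vel, 0.0
        for u in seq:  # simulate a double-integrator model forward
            v += u * dt
            p += v * dt
            cost += (p - target) ** 2 + 0.01 * u ** 2  # tracking error plus control effort
        if cost < best_cost:
            best_cost, best_first = cost, seq[0]
    return best_first

pos, vel = 0.0, 0.0
for _ in range(20):  # receding-horizon loop: apply only the first control each step
    u = mpc_step(pos, vel, target=5.0)
    vel += u * 0.5
    pos += vel * 0.5
```

A 3D visualization tool of the kind described would render, at each step, the predicted trajectories the optimizer considered alongside the one it chose.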
370

Skin lesion segmentation and classification using deep learning

Unknown Date (has links)
Melanoma, a severe and life-threatening skin cancer, is commonly misdiagnosed or left undiagnosed. Advances in artificial intelligence, particularly deep learning, have enabled the design and implementation of intelligent solutions to skin lesion detection and classification from visible light images, which are capable of performing early and accurate diagnosis of melanoma and other types of skin diseases. This work presents solutions to the problems of skin lesion segmentation and classification. The proposed classification approach leverages convolutional neural networks and transfer learning. Additionally, the impact of segmentation (i.e., isolating the lesion from the rest of the image) on the performance of the classifier is investigated, leading to the conclusion that there is an optimal region between “dermatologist segmented” and “not segmented” that produces best results, suggesting that the context around a lesion is helpful as the model is trained and built. Generative adversarial networks, in the context of extending limited datasets by creating synthetic samples of skin lesions, are also explored. The robustness and security of skin lesion classifiers using convolutional neural networks are examined and stress-tested by implementing adversarial examples. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2018. / FAU Electronic Theses and Dissertations Collection
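The finding that an optimal amount of context lies between "dermatologist segmented" and "not segmented" can be pictured as cropping the lesion's bounding box expanded by a configurable margin of surrounding skin before training. A toy sketch on a 2-D grid (the helper and margin parameter are hypothetical, for illustration only):

```python
def crop_with_margin(image, mask, margin):
    """Crop the lesion's bounding box from `image`, expanded by `margin` pixels of context."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for c in range(len(mask[0])) if any(row[c] for row in mask)]
    r0 = max(min(rows) - margin, 0)
    r1 = min(max(rows) + margin + 1, len(image))
    c0 = max(min(cols) - margin, 0)
    c1 = min(max(cols) + margin + 1, len(image[0]))
    return [row[c0:c1] for row in image[r0:r1]]

# 6x6 image with a 2x2 "lesion" marked in the mask.
image = [[10 * r + c for c in range(6)] for r in range(6)]
mask = [[1 if 2 <= r <= 3 and 2 <= c <= 3 else 0 for c in range(6)] for r in range(6)]
tight = crop_with_margin(image, mask, margin=0)    # lesion only: 2x2
context = crop_with_margin(image, mask, margin=1)  # lesion plus surrounding skin: 4x4
```

Sweeping the margin from zero (tight segmentation) to the full image (no segmentation) and measuring classifier accuracy at each setting is one way to locate the optimum the abstract describes.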
