  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1141

Algorithmes de recommandation musicale / Music recommendation algorithms

Maillet, François 12 1900
This thesis consists of three papers united under the general theme of large-scale music recommendation. The first paper presents a recommendation technique that collects text descriptions (tags) of items and uses this textual aura to compute the similarity between them with techniques drawn from information retrieval. We show how this representation can be used to explain the similarities between items using terms from the textual aura, and further how it can be used to steer the recommender. Because our system is content-based, it does not suffer from the usual problems associated with collaborative filtering recommenders, such as the cold start problem. The second paper presents a machine learning model that automatically applies tags to songs from features extracted from their audio files. The model was trained on a very large data set constructed from social data gathered from the online community Last.fm. The third paper presents an approach to generating steerable playlists. We first demonstrate a method for learning song transition probabilities from audio features extracted from songs played in professional radio station playlists. We then show that, by using this learnt similarity function as a prior, we can generate steerable playlists by choosing the next song to play based not only on that prior, but also on a tag cloud that the user can manipulate to express the high-level characteristics of the music he wishes to listen to.
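The tag-based ("textual aura") similarity described in the first paper can be sketched roughly as TF-IDF weighting plus cosine similarity; the tag vocabulary, counts, and exact weighting scheme below are illustrative assumptions, not details taken from the thesis.

```python
import math

# Illustrative tag counts per track (hypothetical data, not from the thesis).
tracks = {
    "track_a": {"rock": 10, "guitar": 4, "energetic": 2},
    "track_b": {"rock": 7, "guitar": 6, "mellow": 1},
    "track_c": {"jazz": 9, "piano": 5, "mellow": 4},
}

def tfidf_vectors(corpus):
    """Weight each tag by term frequency * inverse document frequency."""
    n = len(corpus)
    df = {}
    for tags in corpus.values():
        for t in tags:
            df[t] = df.get(t, 0) + 1
    vecs = {}
    for name, tags in corpus.items():
        total = sum(tags.values())
        vecs[name] = {t: (c / total) * math.log(1 + n / df[t])
                      for t, c in tags.items()}
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse tag vectors."""
    dot = sum(u.get(t, 0.0) * v.get(t, 0.0) for t in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

vecs = tfidf_vectors(tracks)
# Tracks sharing "rock"/"guitar" tags come out far more similar than rock vs. jazz.
sim_ab = cosine(vecs["track_a"], vecs["track_b"])
sim_ac = cosine(vecs["track_a"], vecs["track_c"])
```

Because the similarity is carried by named tags, the top shared terms between two tracks double as a human-readable explanation of the recommendation, which is the transparency property the abstract emphasizes.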
1142

TAARAC : test d'anglais adaptatif par raisonnement à base de cas / TAARAC: an adaptive English test using case-based reasoning

Lakhlili, Zakia January 2007
Thesis digitized by the Division de la gestion de documents et des archives of the Université de Montréal.
1143

A hybrid prognostic methodology and its application to well-controlled engineering systems

Eker, Ömer F. January 2015
This thesis presents a novel hybrid prognostic methodology, integrating physics-based and data-driven prognostic models, to enhance prognostic accuracy, robustness, and applicability. The methodology combines the short-term predictions of a physics-based model with the longer-term projections of a similarity-based data-driven model to obtain remaining useful life estimations. It has been applied to specific components of two different engineering systems, one representing an accelerated and the other a nominal degradation process: filter clogging and fatigue crack propagation were selected as case studies. An experimental rig was developed to investigate the accelerated clogging phenomenon, whereas the publicly available Virkler fatigue crack propagation dataset was chosen after an extensive literature search and dataset analysis. The filter clogging rig is designed to produce reproducible clogging data under different operational profiles, intended to serve as a benchmark dataset for prognostic models. The performance of the methodology has been evaluated by comparing the remaining useful life estimations obtained from the hybrid model and from the individual prognostic models, based on the most recent prognostic evaluation metrics. The results show that the presented methodology improves accuracy, robustness, and applicability. The work contained herein is therefore expected to contribute to scientific knowledge as well as industrial technology development.
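The core idea of fusing a short-term physics-based prediction with a similarity-based projection can be sketched as follows; the linear-extrapolation "physics" model, the fixed blending weight, and the toy degradation data are all illustrative assumptions, not the thesis's actual models.

```python
# Hedged sketch of hybrid RUL (remaining useful life) fusion: blend a
# short-horizon physics-style estimate with a similarity-based estimate.

def physics_rul(history, threshold, dt=1.0):
    """Extrapolate the last observed degradation rate (a stand-in physics model)."""
    rate = (history[-1] - history[-2]) / dt
    if rate <= 0:
        return float("inf")
    return (threshold - history[-1]) / rate

def similarity_rul(history, library, threshold):
    """Find the library run whose recent shape best matches, read off its RUL."""
    n = len(history)
    best, best_dist = None, float("inf")
    for run in library:
        for end in range(n, len(run) + 1):
            window = run[end - n:end]
            dist = sum((a - b) ** 2 for a, b in zip(window, history))
            if dist < best_dist:
                # RUL = steps until this historical run first crosses the threshold.
                future = run[end:]
                crossing = next((i for i, v in enumerate(future) if v >= threshold),
                                len(future))
                best, best_dist = crossing, dist
    return float(best)

def hybrid_rul(history, library, threshold, w_physics=0.5):
    p = physics_rul(history, threshold)
    s = similarity_rul(history, library, threshold)
    return w_physics * p + (1 - w_physics) * s

library = [[0, 1, 2, 4, 7, 11, 16]]   # one historical run-to-failure (toy data)
history = [0, 1, 2, 4]                # current unit, observed so far
rul = hybrid_rul(history, library, threshold=16.0)
```

In practice the weighting would shift over time rather than stay fixed: the physics model is trusted near the current state, the similarity-based projection over longer horizons, which is the rationale the abstract describes.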
1144

Rupture d'interfaces en présence d'agents de surface / Interface breakup in the presence of surface agents

Roché, Matthieu 19 December 2008
Droplet detachment is ubiquitous in everyday life. It results from the rupture of the interface separating the dispersed fluid from the surrounding one, a process that has been widely studied: it is well established that its dynamics are governed by a competition between capillarity, inertia, and viscosity. This manuscript describes how the breakup dynamics change when the properties of the interface between two fluids are modified by surface agents. When the surface agent is a surfactant (SDS), the thinning dynamics can follow one of two modes. In the first mode, thinning is characterized by two linear-in-time regimes; the second mode comprises three. In both cases, thinning starts with a first regime followed by a steeper second regime; when a third regime exists, its slope is smaller than that of the second. These slope variations reflect the dynamical behaviour of the surfactant at the interface. The interfacial tension $\gamma$ extracted from the slope of the first linear regime agrees with the equilibrium interfacial tension of the system, $\gamma_{eq}$. The faster thinning during the second regime is linked to a partial depletion of surfactant in the zone of maximal thinning. The slowdown observed during the third regime is related to the displacement of this zone towards a region richer in surfactant, where $\gamma$ is lower. The thinning dynamics of the neck are very different when polymers of intermediate molecular weight ($\sim$100 kDa) are present together with SDS in the continuous phase. When $C_{SDS}$ exceeds 0.15 times the critical micellar concentration (CMC), the behaviour is identical to that observed with surfactant alone. Below 0.15 CMC, thinning slows down exponentially as breakup is approached and a beads-on-a-string phenomenon appears. These observations are analogous to what is seen when a polymer solution is driven to breakup, yet in our case the polymers are only at the surface of the jet, not in its bulk. An analysis of the neck profiles over time shows that they are self-similar close to breakup. Although the systems studied here are more complex, they share qualitative features with the breakup of simple-fluid interfaces; quantitatively, however, slopes and angles differ markedly.
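Extracting an interfacial tension from a linear thinning regime can be illustrated with the visco-capillary scaling $h_{min}(t) = 0.0709\,(\gamma/\eta)(t_b - t)$ (Papageorgiou's prefactor for a viscous thread). Whether the thesis uses this exact relation is an assumption, and the viscosity value and thinning data below are synthetic.

```python
# Hedged sketch: recover an interfacial tension gamma from the slope of a
# linear neck-thinning regime, assuming h_min(t) = C*(gamma/eta)*(t_b - t)
# with Papageorgiou's prefactor C ~ 0.0709. Data are synthetic.
C = 0.0709
eta = 0.05          # continuous-phase viscosity, Pa.s (assumed value)
gamma_true = 0.036  # N/m, a typical SDS-solution value (assumed)
t_b = 1.0           # breakup time, s

ts = [0.90 + 0.01 * i for i in range(10)]
h = [C * (gamma_true / eta) * (t_b - t) for t in ts]

# Least-squares slope of h_min versus t (closed form for a straight line).
n = len(ts)
mean_t = sum(ts) / n
mean_h = sum(h) / n
slope = sum((t - mean_t) * (y - mean_h) for t, y in zip(ts, h)) / \
        sum((t - mean_t) ** 2 for t in ts)

# slope = -C*gamma/eta  =>  gamma = -slope*eta/C
gamma_fit = -slope * eta / C
```

The abstract's observation that the second regime thins faster would translate, in this picture, into a steeper (more negative) slope and hence an apparently lower local $\gamma$, consistent with surfactant depletion in the thinning zone.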
1145

Acquisition et consolidation de représentations distribuées de séquences motrices, mesurées par IRMf / Acquisition and consolidation of distributed representations of motor sequences, measured with fMRI

Pinsard, Basile 09 1900
No description available.
1146

Triangular similarity metric learning : A siamese architecture approach / Apprentissage métrique de similarité triangulaire : Une approche d'architecture siamois

Zheng, Lilei 10 May 2016
In many machine learning and pattern recognition tasks, there is a need for appropriate metric functions to measure the pairwise distance or similarity between data, where a metric function is one that defines a distance or similarity between each pair of elements of a set. In this thesis, we propose Triangular Similarity Metric Learning (TSML) for automatically specifying such a metric from data. A TSML system is built on a siamese architecture consisting of two identical sub-systems that share the same set of parameters; each sub-system processes a single data sample, so the whole system receives a pair of samples as input. The system comprises a cost function parameterizing the pairwise relationship between data and a mapping function allowing it to learn high-level features from the training data. For the cost function, we first propose the Triangular Similarity, a novel similarity metric that is equivalent to the well-known Cosine Similarity when measuring a data pair. Based on a simplified version of the Triangular Similarity, we then develop the triangular loss function in order to perform metric learning, i.e. to increase the similarity between two vectors of the same class and to decrease the similarity between two vectors of different classes. Compared with other distance or similarity metrics, the triangular loss and its gradient naturally offer an intuitive and interesting geometrical interpretation of the metric learning objective. For the mapping function, we introduce three options: a linear mapping realized by a simple transformation matrix, a nonlinear mapping realized by Multi-Layer Perceptrons (MLP), and a deep nonlinear mapping realized by Convolutional Neural Networks (CNN). With these mapping functions, we present three TSML systems for various applications, namely pairwise verification, object identification, dimensionality reduction, and data visualization. For each application, we carry out extensive experiments on popular benchmarks and datasets to demonstrate the effectiveness of the proposed systems.
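A minimal numerical sketch of a triangle-inequality pair loss in the spirit of the triangular loss follows; the exact formulation and normalization used in the thesis may differ, and the vectors are toy data.

```python
import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def triangular_loss(a, b, s):
    """Triangle-inequality pair loss (hedged sketch; the thesis's exact
    formulation may differ). s = +1 for a same-class pair, -1 otherwise.
    With c = a + s*b, the quantity ||a|| + ||b|| - ||c|| is >= 0 by the
    triangle inequality and vanishes only when a and s*b point the same
    way, so minimizing it aligns same-class pairs and opposes
    different-class pairs -- the stated metric-learning objective."""
    c = [x + s * y for x, y in zip(a, b)]
    return norm(a) + norm(b) - norm(c)

aligned = triangular_loss([1.0, 0.0], [2.0, 0.0], s=+1)   # parallel pair
opposed = triangular_loss([1.0, 0.0], [0.0, 1.0], s=+1)   # orthogonal pair
```

The geometric reading matches the abstract's claim: the loss is literally the slack in the triangle inequality for the triangle formed by `a`, `s*b`, and their sum, which is why its gradient has a direct geometric interpretation and why, at the optimum, the cosine between same-class vectors is maximal.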
1147

Adequando consultas por similaridade para reduzir a descontinuidade semântica na recuperação de imagens por conteúdo / Reducing the semantic gap in content-based image retrieval with similarity queries

Razente, Humberto Luiz 31 August 2009
The increasing number of images captured in digital media has fostered the development of new methods for retrieving them. Dissimilarity is one criterion that can be used for image retrieval, where the results are the images most similar to a given reference. Such queries rely on feature vectors automatically extracted from the images and on distance functions that measure the dissimilarity between pairs of vectors. Unfortunately, simple content-based queries tend to return images that do not fulfill the user's interest mixed in with the meaningful results, because there is in general a semantic gap between the automatically extracted features and the subjectivity of human interpretation. This problem has led to the development of many methods to reduce the semantic gap. The main focus of this thesis is the development of scalable methods for reducing the semantic gap in real-time content-based image retrieval systems. To this end, we present: a formalization of similarity queries that allow multiple query centers in metric spaces, serving as a basis for relevance feedback methods; an exact method for optimizing these queries in such spaces; and a model for handling diversity in nearest-neighbor similarity queries, together with heuristics for its optimization.
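A multi-center similarity query of the kind formalized above might look like this sketch: the distance from an object to the query is aggregated over several centers, such as those supplied by relevance feedback. The min-aggregation, the Euclidean metric, and the 2-D toy objects are illustrative choices, not the thesis's definitions.

```python
import math

def euclid(p, q):
    return math.dist(p, q)

def multicenter_knn(objects, centers, k):
    """Return the k objects closest to ANY of the query centers
    (min-aggregation over centers; other aggregations are possible)."""
    scored = [(min(euclid(o, c) for c in centers), o) for o in objects]
    scored.sort(key=lambda t: t[0])
    return [o for _, o in scored[:k]]

objects = [(0.0, 0.0), (1.0, 0.1), (5.0, 5.0), (5.2, 4.9), (9.0, 9.0)]
centers = [(1.0, 0.0), (5.0, 5.0)]   # two relevance-feedback query centers
hits = multicenter_knn(objects, centers, k=3)
```

With a single center this reduces to an ordinary k-nearest-neighbor query; adding centers lets the query region wrap around several examples the user marked as relevant, which is the role the abstract assigns to relevance feedback.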
1148

Perspectiva histórica e tecnológica da calibração do túnel 2 do Sistema Cantareira de adução de água para a região metropolitana de São Paulo / Historical and technological perspective of the test for the calibration of tunnel 2 of the Cantareira water project for the metropolitan area of São Paulo

Pinheiro, Hemerson Donizete 22 March 2007
Historiographic works that aim to recover the scientific and technological production of a country have helped in understanding and gauging its level of development. With this work, we hope to begin, at the Departamento de Hidráulica e Saneamento (SHS) of the Escola de Engenharia de São Carlos, a historical survey of the technological and scientific production of its graduate students and professors. To that end, we revisit a study carried out between the end of the 1960s and the beginning of the 1970s by the Chair of Fluid Mechanics (precursor of the SHS), whose objective was to predict the water flow rate in tunnel 2 of the Cantareira water-supply system for the metropolitan area of São Paulo. An original test using air circulation was carried out in the tunnel; methods and techniques were developed to verify the flow rate and to check the tunnel's dimensioning, so as to guarantee a flow of 33 m³/s. The documents produced by the authors of the test were collected, organized chronologically, and analyzed in order to trace the theoretical influences that guided the methodologies, to describe the techniques and technologies employed, and to assess, according to the documentary record, the execution and the results achieved by the tests.
1149

Segmentação da estrutura cerebral hipocampo por meio de nuvem de similaridade / Automatic hippocampus segmentation through similarity cloud

Athó, Fredy Edgar Carranza 03 August 2011
The hippocampus is a brain structure that plays a central role in the human memory system. Modifications of its tissues lead to neurodegenerative diseases such as epilepsy, multiple sclerosis, and dementia, among others. To measure hippocampal atrophy, the structure must be isolated from the rest of the brain: separating the hippocampus helps specialists analyze and understand its volume reduction and detect any abnormality present. Hippocampus extraction is still performed mostly by manual segmentation, which is time-consuming because it depends on user interaction, so automatic segmentation has been investigated as an alternative that overcomes these limitations. This master's dissertation presents a new automatic segmentation method called the Similarity Cloud Model (SimCM). The segmentation process consists of two main operations: i) localization by similarity and ii) cloud adjustment. The first operation uses the cloud to localize the most probable position of the hippocampus in a target volume; the second uses the cloud to correct the final delineation, based on a new method for arc-weight re-adjustment. The method was tested on a dataset of 235 MRIs combining healthy controls and epileptic patients. The results indicate superior performance, in terms of both effectiveness (segmentation quality) and efficiency (processing time), compared with similar graph-based and Bayesian models. As future work, we intend to use feature selection to improve cloud construction and tissue delineation.
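The "localization by similarity" step can be illustrated with a 1-D stand-in: slide a template (playing the role of the similarity cloud) over a toy intensity profile and keep the best-matching offset. The real SimCM operates on 3-D MRI volumes with a learned cloud and a graph-based adjustment step; the SSD score and the data here are illustrative assumptions.

```python
# Hedged sketch of localization by similarity: exhaustive sliding-window
# matching with a sum-of-squared-differences score over a 1-D "volume".

def localize(volume, template):
    """Return the offset where the template matches the volume best."""
    best_offset, best_score = 0, float("inf")
    for off in range(len(volume) - len(template) + 1):
        score = sum((volume[off + i] - t) ** 2 for i, t in enumerate(template))
        if score < best_score:
            best_offset, best_score = off, score
    return best_offset

volume = [0, 0, 1, 3, 7, 3, 1, 0, 0]   # toy intensity profile with one bump
template = [1, 3, 7, 3, 1]             # expected appearance of the structure
pos = localize(volume, template)
```

In 3-D the same idea applies with a triple-nested offset search (or a coarse-to-fine strategy to keep it tractable), after which the adjustment step refines the boundary rather than the position.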
1150

Sistematização da percepção médica na construção de sistemas para recuperação de imagens por conteúdo / Systematization of medical perception in implementing of content-based image retrieval systems

Silva, Marcelo Ponciano da 27 February 2014
In recent years the world has experienced an avalanche of new technologies to aid medical diagnosis. These efforts seek a quick and accurate diagnosis through exams and information about the patient's physical condition; radiology, through medical images, visualizes internal organs and structures of the human body in search of answers to the suspected problems expressed by the signs and symptoms reported by the patient. In this area, Picture Archiving and Communication Systems (PACS) have helped to store and organize the growing number of images generated by hospital exams. Medical research has shown the potential of these images as an aid to the practice of Similar Case-Based Reasoning (SCBR) in medicine. For this reason, there is an ongoing effort in the literature to develop computational techniques for Content-Based Image Retrieval (CBIR) over large data sets. Similarity queries are essential to support the practice of SCBR and the discovery of the behavior of lesions caused by various diseases. Research in CBIR faces several challenges. One is the discrepancy between the results obtained automatically and those expected by radiologists (the semantic gap); another is the lack of studies on the clinical viability of these tools as diagnostic aids. These two obstacles are largely responsible for the reluctance to adopt this technology in the medical-hospital environment. Given this scenario, this work proposes a mechanism to bridge the semantic gap while bringing CBIR closer to its real application environment. The main contribution is a methodology based on perceptual parameters that brings the system closer to the level of perception of the physician user. We then conducted a study on the clinical viability of a CBIR system at the Clinical Hospital of the University of São Paulo at Ribeirão Preto. The proposed methodology was applied, and the results demonstrated the applicability of CBIR systems as a computer-aided diagnosis tool in a real clinical environment.
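A perceptually weighted distance of the general kind such a methodology relies on can be sketched as follows; the hand-set weights stand in for the thesis's perceptual parameters, and the feature names and values are toy assumptions, not data from the study.

```python
import math

def weighted_dist(u, v, w):
    """Weighted Euclidean distance: weights stretch the feature axes
    the physician judges diagnostically important (hedged stand-in for
    the thesis's perceptual parameters)."""
    return math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, u, v)))

def knn(query, database, w, k=1):
    ranked = sorted(database, key=lambda item: weighted_dist(query, item[1], w))
    return [name for name, _ in ranked[:k]]

# Toy features per image: [texture score, mean intensity]
database = [
    ("lesion_a", [0.9, 0.90]),   # matches the query's intensity
    ("lesion_b", [0.1, 0.45]),   # matches the query's texture
]
query = [0.2, 0.90]

plain = knn(query, database, w=[1.0, 1.0])        # uniform weights
perceptual = knn(query, database, w=[0.1, 10.0])  # intensity emphasized
```

The point of the sketch is that the same query and the same features return different "most similar" cases once the distance is reweighted, which is exactly the lever a perception-driven methodology adjusts to close the gap between automatic rankings and what radiologists expect.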
