101

Assessing Viability of Open-Source Battery Cycling Data for Use in Data-Driven Battery Degradation Models

Ritesh Gautam (17582694) 08 December 2023 (has links)
<p dir="ltr">Lithium-ion batteries are being used increasingly more often to provide power for systems that range all the way from common cell-phones and laptops to advanced electric automotive and aircraft vehicles. However, as is the case for all battery types, lithium-ion batteries are prone to naturally occurring degradation phenomenon that limit their effective use in these systems to a finite amount of time. This degradation is caused by a plethora of variables and conditions including things like environmental conditions, physical stress/strain on the body of the battery cell, and charge/discharge parameters and cycling. Accurately and reliably being able to predict this degradation behavior in battery systems is crucial for any party looking to implement and use battery powered systems. However, due to the complicated non-linear multivariable processes that affect battery degradation, this can be difficult to achieve. Compared to traditional methods of battery degradation prediction and modeling like equivalent circuit models and physics-based electrochemical models, data-driven machine learning tools have been shown to be able to handle predicting and classifying the complex nature of battery degradation without requiring any prior knowledge of the physical systems they are describing.</p><p dir="ltr">One of the most critical steps in developing these data-driven neural network algorithms is data procurement and preprocessing. Without large amounts of high-quality data, no matter how advanced and accurate the architecture is designed, the neural network prediction tool will not be as effective as one trained on high quality, vast quantities of data. This work aims to gather battery degradation data from a wide variety of sources and studies, examine how the data was produced, test the effectiveness of the data in the Interfacial Multiphysics Laboratory’s autoencoder based neural network tool CD-Net, and analyze the results to determine factors that make battery degradation datasets perform better for use in machine learning/deep learning tools. This work also aims to relate this work to other data-driven models by comparing the CD-Net model’s performance with the publicly available BEEP’s (Battery Evaluation and Early Prediction) ElasticNet model. The reported accuracy and prediction models from the CD-Net and ElasticNet tools demonstrate that larger datasets with actively selected training/testing designations and less errors in the data produce much higher quality neural networks that are much more reliable in estimating the state-of-health of lithium-ion battery systems. The results also demonstrate that data-driven models are much less effective when trained using data from multiple different cell chemistries, form factors, and cycling conditions compared to more congruent datasets when attempting to create a generalized prediction model applicable to multiple forms of battery cells and applications.</p>
102

Bullying Detection through Graph Machine Learning : Applying Neo4j’s Unsupervised Graph Learning Techniques to the Friends Dataset

Enström, Olof, Eid, Christoffer January 2023 (has links)
In recent years, the pervasive issue of bullying, particularly in academic institutions, has witnessed a surge in attention. This report centers on the use of the Friends Dataset and graph machine learning to detect possible instances of bullying in an educational setting. The importance of this research lies in its potential to enhance early detection and prevention mechanisms, thereby creating safer environments for students. Leveraging graph theory, Neo4j, the Graph Data Science library, and similarity algorithms, among other tools and methods, we devised an approach for processing and analyzing the dataset. Our method involves data preprocessing, application of similarity and community detection algorithms, and result validation with domain experts. The findings of our research indicate that graph machine learning can be effectively utilized to identify potential bullying scenarios, with a particular focus on discerning community structures and their influence on bullying. Our results, albeit preliminary, represent a promising step towards leveraging technology for bullying detection and prevention.
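A minimal sketch of the similarity-plus-community-detection idea follows. It uses networkx rather than Neo4j's Graph Data Science library so that it runs self-contained, and the friendship graph is invented.

```python
# Sketch: pairing node similarity with community detection on a friendship
# graph, analogous to the Neo4j GDS workflow described above but written
# with networkx so the example is self-contained. The graph is invented.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([("A", "B"), ("A", "C"), ("B", "C"),
                  ("D", "E"), ("E", "F"), ("C", "D")])

# Community detection: peer groups that may shape bullying dynamics.
communities = list(greedy_modularity_communities(G))
print("communities:", communities)

# Jaccard similarity over shared friends: pairs with low similarity across
# community boundaries can be flagged for review by domain experts.
for u, v, score in nx.jaccard_coefficient(G, [("A", "D"), ("A", "F")]):
    print(f"similarity({u},{v}) = {score:.2f}")
```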
103

Évaluation de programmes de prétraitement de signal d'activité électrodermale (EDA) / Evaluation of electrodermal activity (EDA) signal preprocessing programs

DeRoy, Claudéric 08 1900 (has links)
Link to the GitHub repository containing all the tools programmed as part of this thesis: https://github.com/neurok8050/eda-optimisation-processing-tool / Electrodermal activity (EDA), particularly the skin conductance response (SCR), is a psychophysiological signal frequently used in psychology and cognitive neuroscience research. Nevertheless, using EDA comes with challenges, notably with regard to its preprocessing; indeed, very few research teams adequately preprocess their data. Our objective is to promote the implementation of SCR preprocessing and to offer recommendations to researchers by providing data on the effect of preprocessing on the ability of SCRs to discriminate between two experimental conditions. Based on similar work, we tested the effect of preprocessing combinations using different filtering methods, different rescaling methods, and the inclusion of an automatic motion-artifact detection step, while using different operationalist metrics (peak-scoring (PS) and area under the curve (AUC)) and different model-based metrics. Finally, we tested whether a single combination could be used across different datasets or whether preprocessing should be optimized individually for each dataset. Our results show that 1) including the automatic motion-artifact detection step did not significantly affect the ability to discriminate between two experimental conditions, 2) the model-based approach seems slightly better at discriminating between two experimental conditions, and 3) the best preprocessing combination seems to vary across datasets. The data and tools presented in this master's thesis should promote and facilitate SCR signal preprocessing.
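One preprocessing combination of the kind compared in the thesis might look like the following sketch (low-pass filtering, min-max rescaling, then an AUC metric); the sampling rate, cutoff, and response window are illustrative assumptions, not the thesis's settings.

```python
# Sketch: one SCR preprocessing combination of the kind compared above:
# low-pass filtering, min-max rescaling, then an AUC metric over a
# response window. All parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 32.0                       # sampling rate (Hz), assumed
t = np.arange(0, 20, 1 / fs)
raw = np.exp(-((t - 5) ** 2)) + 0.05 * np.random.default_rng(0).normal(size=t.size)

# Zero-phase Butterworth low-pass filter (1 Hz cutoff, assumed).
b, a = butter(4, 1.0 / (fs / 2), btype="low")
filtered = filtfilt(b, a, raw)

# Min-max rescaling so amplitudes are comparable across participants.
rescaled = (filtered - filtered.min()) / (filtered.max() - filtered.min())

# Area under the curve in a response window: one operationalist metric.
window = (t >= 4) & (t <= 8)
print("AUC in 4-8 s window:", np.trapz(rescaled[window], t[window]))
```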
104

API data gathering and structuring for machine learning and human use : Optimizing API data for both financial machine learning and being easy to read and use by the end user / API-datainsamling och strukturering för maskininlärning och människa : Optimering av API-data för både finansiell maskininlärning och för att vara enkel att läsa och använda för användaren

Forshällen, Axel January 2022 (has links)
This thesis looks into how to implement an abstraction layer between transaction data gathered from Revised Payment Services Directive (PSD2) compliant banks via an Application Programming Interface (API) and a database, with a human user interface for reading and structuring the data. APIs for data sources tend not to have a standardized structure, and this creates a problem for machine learning: the machine learning component either has to be built around the data collected from the API, or the data has to be transformed and reformatted to fit the structure of the machine learning component's database. An application will use the abstraction layer to fetch data and to allow the user to set up how the data should be reformatted before being sent to the machine learning component's database. The application has to display the data in an easy-to-read format and be usable by a human user. The main questions are (i) how this abstraction should be implemented, (ii) how much of it can be automated, and (iii) what the optimal design for the components is. PSD2 open banking systems in Sweden use Representational State Transfer (REST) APIs and provide data in the JavaScript Object Notation (JSON) format, which can be considered the de facto standard. The abstractions can be divided into three areas: authorization, account and transaction access, and transaction data. Of these, only the transaction data could be fully abstracted. The account and transaction access process could be partly abstracted, while the authorization process can only be broken down into steps, as there is no possibility of abstracting the parameters used by the different banks. The project aimed to produce a fully functioning application for gathering data via PSD2 open banking in which the user can configure the application through a simple system that does not require much knowledge of coding. While the process of fetching transaction data from PSD2 APIs is simplified, the goal of being useful to a person without coding knowledge is currently unattainable unless PSD2 open banking is standardized or more advanced tools are used.
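The transaction-data abstraction could, for example, take the shape of the following sketch, which maps two bank payload formats onto a single schema before the data reaches the machine learning database; the bank identifiers and field layouts are invented for illustration.

```python
# Sketch: an abstraction layer normalizing heterogeneous PSD2 transaction
# JSON into one common schema. The two bank payload shapes below are
# hypothetical examples, not real bank APIs.
from dataclasses import dataclass

@dataclass
class Transaction:
    booking_date: str
    amount: float
    currency: str
    description: str

def normalize(bank_id: str, payload: dict) -> Transaction:
    """Map a bank-specific JSON record onto the common schema."""
    if bank_id == "bank_a":  # hypothetical payload structure
        return Transaction(payload["bookingDate"],
                           float(payload["transactionAmount"]["amount"]),
                           payload["transactionAmount"]["currency"],
                           payload.get("remittanceInformationUnstructured", ""))
    if bank_id == "bank_b":  # hypothetical payload structure
        return Transaction(payload["date"], float(payload["amt"]),
                           payload["ccy"], payload.get("text", ""))
    raise ValueError(f"no mapping configured for {bank_id}")

print(normalize("bank_b",
                {"date": "2022-03-01", "amt": "-42.50", "ccy": "SEK", "text": "Coffee"}))
```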
105

Advanced Color Projector Design Based on Human Visual System

Thakur, Mahesh Kumar Singh January 2011 (has links)
No description available.
106

On the development of an open-source preprocessing framework for finite element simulations

Alexandra D Mallory (6640721) 14 May 2019 (has links)
Computational modeling is essential for material and structural analyses for a multitude of reasons, including improving designs and reducing manufacturing costs. However, the cost of commercial finite element packages prevents companies with limited financial resources from accessing them. Free finite element solvers, such as Warp3D, exist as robust alternatives to commercial finite element analysis (FEA) packages. But these and other open-source finite element solvers are not necessarily easy to use, mainly due to the lack of a preprocessing framework in which users can generate meshes, apply boundary conditions and forces, or define materials. We developed a preprocessor for Warp3D, referred to as W3DInput, to generate input files for the solver. W3DInput creates a general framework, at no cost, to go from CAD models to structural analysis. With this preprocessor, the user can import a mesh from a mesh-generation tool (Gmsh was used for this project), and the preprocessor steps the user through the inputs required for a Warp3D file. The resulting input file is guaranteed to be in the correct order and format readable by the solver, making it more accessible to users of all levels. Five use cases were created with the preprocessor: a cantilever beam, a displacement-control test, a displacement-control test with a material defined by a user-defined stress-strain curve, a crystal plasticity model, and a pallet. Results were output to Exodus II files for viewing in ParaView and were verified by checking the stress-strain curves. Results from these use cases show that the input files generated by the preprocessor were correct.
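A sketch of the preprocessor's core idea, reading nodes from a Gmsh MSH 2.2 file and writing an input deck, is shown below; the output keywords are placeholders, not Warp3D's actual input syntax, and the file names are hypothetical.

```python
# Sketch: the core of a preprocessor like W3DInput. Reads node coordinates
# from a Gmsh MSH 2.2 file and emits a solver input deck. The output
# keywords are illustrative placeholders, not Warp3D's real syntax.
def read_gmsh_nodes(path: str) -> dict[int, tuple[float, float, float]]:
    nodes, in_block = {}, False
    with open(path) as f:
        for line in f:
            token = line.strip()
            if token == "$Nodes":
                in_block = True
                next(f)  # skip the node-count line
            elif token == "$EndNodes":
                break
            elif in_block:
                nid, x, y, z = token.split()
                nodes[int(nid)] = (float(x), float(y), float(z))
    return nodes

def write_input_deck(nodes: dict, path: str) -> None:
    with open(path, "w") as out:
        out.write(f"number of nodes {len(nodes)}\n")  # placeholder keyword
        out.write("coordinates\n")
        for nid, (x, y, z) in sorted(nodes.items()):
            out.write(f"{nid} {x:.6e} {y:.6e} {z:.6e}\n")

nodes = read_gmsh_nodes("beam.msh")   # hypothetical mesh file
write_input_deck(nodes, "beam.inp")
```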
107

Métodos de pré-processamento de texturas para otimizar o reconhecimento de padrões / Texture preprocessing methods to optimize pattern recognition

Neiva, Mariane Barros 19 July 2016 (has links)
The texture of an image is an important source of information about the image's content. Using this information for pattern recognition has become very popular in the image processing field, with applications in the textile industry, biology, medical image analysis, satellite imagery, industrial part inspection, and more. Many researchers focus on creating methods that convert the input image into a feature vector in order to classify the image based on these vectors. However, images can also be modified so that important features are enhanced, allowing existing descriptors to extract features more easily and represent the image better. This project applies six different preprocessing methods to analyze their power to enhance texture extraction. The methods are: isotropic diffusion, classic anisotropic diffusion, two regularizations of anisotropic diffusion, a morphological diffusion, and the distance transform. To extract features from these modified images, seven texture analysis algorithms from the literature are used, along with KNN and Naive Bayes classifiers. Results show a significant increase in classifier performance when the Brodatz, Vistex, Usptex, and Outex datasets are transformed prior to texture analysis and classification.
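As a toy version of this pipeline, the sketch below pairs Gaussian filtering (a stand-in for isotropic diffusion, which it approximates) with simple histogram features and a KNN classifier on synthetic textures; it uses neither the thesis's descriptors nor its datasets.

```python
# Sketch: texture preprocessing before classification. Gaussian filtering
# stands in for isotropic diffusion; features are plain histograms rather
# than the seven descriptors of the thesis. Textures are synthetic.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def make_texture(kind: str) -> np.ndarray:
    img = rng.normal(size=(64, 64))
    return gaussian_filter(img, 3) if kind == "coarse" else img

def features(img: np.ndarray, preprocess: bool) -> np.ndarray:
    if preprocess:
        img = gaussian_filter(img, 1)   # isotropic-diffusion stand-in
    hist, _ = np.histogram(img, bins=16, density=True)
    return hist

for preprocess in (False, True):
    X = [features(make_texture(k), preprocess) for k in ["coarse", "fine"] * 50]
    y = ["coarse", "fine"] * 50
    knn = KNeighborsClassifier(n_neighbors=3).fit(X[:80], y[:80])
    print("preprocess =", preprocess, "accuracy:", knn.score(X[80:], y[80:]))
```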
108

Conception d'une architecture embarquée adaptable pour le déploiement d'applications d'interface cerveau machine / Design of an adaptable embedded architecture for the deployment of brain-machine interface applications

Belwafi, Kais 28 September 2017 (has links)
The main purpose of this thesis is to study and develop an embedded brain-computer interface (BCI) system using a HW/SW co-design methodology in order to satisfy the system's specific constraints. A complete BCI system was achieved, integrating an OpenBCI acquisition system and an FPGA-based processing platform. The proposed system can be used in a variety of contexts: medical (early diagnosis of pathologies, assisting people with severe disabilities in controlling home devices by thought), technological (ubiquitous computing), industrial (communication with robots), games (controlling a joystick in video games), etc. In our study, the proposed BCI platform was designed to let people with severe disabilities control home devices by thought. Particular attention was given to the study and implementation of the filtering module (adaptive and dynamic filtering) as a coprocessor coded in HDL, in order to reduce its execution time, since it is the most critical block in the BCI chain. The feature extraction and classification algorithms are executed on the Nios-II processor, under its operating system, in ANSI C. The prototype operates at 200 MHz and performs real-time classification with an execution delay of 0.4 s per trial; the power consumption of the proposed system is about 0.7 W.
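To give a flavor of what moving filtering into a coprocessor entails, the sketch below implements an FIR filter in fixed-point integer arithmetic, the style of computation that maps directly onto HDL; the tap values, bit widths, and signal are illustrative assumptions, not the thesis's filter design.

```python
# Sketch: a fixed-point FIR filter in integer arithmetic, the kind of
# computation one would move into an HDL coprocessor to cut execution
# time. Tap values and bit widths are illustrative assumptions.
import numpy as np

def fir_fixed_point(samples: np.ndarray, taps: np.ndarray,
                    frac_bits: int = 15) -> np.ndarray:
    """Filter int16 samples with Q1.15 coefficients, as hardware would."""
    q_taps = np.round(taps * (1 << frac_bits)).astype(np.int32)
    out = np.zeros(len(samples), dtype=np.int32)
    for n in range(len(samples)):
        acc = 0  # accumulator; maps to a MAC chain in FPGA DSP blocks
        for k, c in enumerate(q_taps):
            if n - k >= 0:
                acc += int(c) * int(samples[n - k])
        out[n] = acc >> frac_bits  # rescale back to the sample range
    return out

# 5-tap smoother applied to a synthetic EEG-scale signal.
taps = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
eeg = (100 * np.sin(np.linspace(0, 6, 50))).astype(np.int16)
print(fir_fixed_point(eeg, taps)[:10])
```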
109

Localisation et caractérisation du déroulement de la crise d'épilepsie temporale / Localization and characterization of the seizure development of temporal lobe epilepsy

Vélez-Pérez, Hugo Abraham 21 October 2010 (has links)
The electroencephalogram (EEG) is the essential clinical examination for diagnosis, for identifying the brain structures responsible for seizure onset, and for epilepsy classification. However, the signals collected on the surface of the scalp are heavily disturbed by artifacts and noise, which considerably complicates clinical interpretation and automatic analysis. This work aims to extract descriptors of surface EEG signals that can lead to the spatio-temporal characterization of temporal lobe seizures. The inter-channel relationship estimators applied are parametric linear methods, both symmetric and directed, such as the cross-spectrum (S), coherence (C), the Directed Transfer Function (DTF), and Partial Directed Coherence (PDC). Relationships are estimated on real EEG recordings containing a seizure, and the detection of strong inter-channel relationships is exploited to lateralize and then characterize the seizure. All methods are applied to both raw and preprocessed EEG signals. A preprocessing step, based on source separation and classification and on denoising, is implemented to remove artifacts and noise with minimal loss of information, reducing the risk of false detections of inter-signal connectivity relationships. The results on 51 seizures show that preprocessing improves detection and the rate of correct lateralization. A method coupling S with the directed parametric methods (PDC and DTF) improves the characterization of seizures.
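The symmetric coherence estimator (C) named above can be computed as in the following sketch, using scipy on synthetic two-channel signals; the sampling rate, window length, and frequency band are illustrative assumptions.

```python
# Sketch: magnitude-squared coherence between two EEG channels, the
# symmetric estimator (C) named above. Signals are synthetic; the 256 Hz
# rate, 2 s windows, and 5-9 Hz band are illustrative assumptions.
import numpy as np
from scipy.signal import coherence

fs = 256.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
source = np.sin(2 * np.pi * 7 * t)          # 7 Hz ictal-like rhythm
ch_left = source + 0.5 * rng.normal(size=t.size)
ch_right = 0.3 * source + 1.0 * rng.normal(size=t.size)

# Strong band-limited coherence between channels suggests a shared
# generator, the kind of inter-channel relationship used to lateralize.
f, Cxy = coherence(ch_left, ch_right, fs=fs, nperseg=int(2 * fs))
band = (f >= 5) & (f <= 9)
print("mean 5-9 Hz coherence:", Cxy[band].mean())
```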
