21

Prediction of Covid'19 Cases using LSTM

Tanveer, Hafsa January 2021 (has links)
No description available.
22

Interpretable serious event forecasting using machine learning and SHAP

Gustafsson, Sebastian January 2021 (has links)
Accurate forecasts are vital in multiple areas of economic, scientific, commercial, and industrial activity. There are few previous studies on using forecasting methods to predict serious events. This thesis set out to investigate two things: first, whether machine learning models could be applied to forecasting serious events, and second, whether those models could be made interpretable. Given these objectives, the approach was to formulate two forecasting tasks for the models and then use the Python framework SHAP to make them interpretable. The first task was to predict whether a serious event will happen in the coming eight hours. The second task was to forecast how many serious events will happen in the coming six hours. GBDT and LSTM models were implemented, evaluated, and compared on both tasks. Given the complexity of the forecasting problem, the results match those of previous related research. On the classification task, the best performing model achieved an accuracy of 71.6%, and on the regression task, it missed the number of serious events by less than one on average.
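The abstract names the SHAP framework and a GBDT classifier but includes no code; the following is a minimal sketch, under assumed synthetic data and a scikit-learn GradientBoostingClassifier, of how SHAP values can be attached to a binary event-forecasting model. The feature layout and the eight-hour framing are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for aggregated features over a time window.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8))  # 8 hypothetical features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# GBDT classifier: "will a serious event occur in the coming eight hours?"
model = GradientBoostingClassifier().fit(X_train, y_train)
print("accuracy:", model.score(X_test, y_test))

# TreeExplainer yields per-feature contributions for each prediction,
# which is what makes the forecast interpretable.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
print("SHAP values shape:", np.asarray(shap_values).shape)
```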
23

Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning

Lin, Chiao-Shing 03 March 2022 (has links)
The space of hand gesture recognition using radar and sonar is dominated mostly by radar applications. In addition, the machine learning algorithms used by these systems are typically based on convolutional neural networks, with some applications exploring the use of long short-term memory networks. The goal of this study was to design and build a sonar system that can classify hand gestures using a machine learning approach, and secondly to compare convolutional neural networks to long short-term memory networks as a means of classifying hand gestures using sonar. A Doppler sonar system was designed and built to sense hand gestures. The sonar system is a multi-static system containing one transmitter and three receivers, and it measures the Doppler frequency shifts caused by dynamic hand gestures. Since the system uses three receivers, three different Doppler frequency channels are measured. Three additional differential frequency channels are formed by computing the differences between the frequencies of the receivers. These six channels are used as inputs to the deep learning models. Two different deep learning algorithms were used to classify the hand gestures: a Doppler biLSTM network [1] and a CNN [2]. Six basic hand gestures, two along each of the x-, y-, and z-axes, and two rotational hand gestures were recorded using both the left and right hand at different distances. Ten-fold cross-validation was used to evaluate the networks' performance and classification accuracy. The LSTM was able to classify the six basic gestures with an accuracy of at least 96%, but with the addition of the two rotational gestures, the accuracy dropped to 47%. This result is acceptable since the basic gestures are used more commonly than rotational gestures. The CNN was able to classify all the gestures with an accuracy of at least 98%. Additionally, the LSTM network can classify separate left- and right-hand gestures with an accuracy of 80%, and the CNN with an accuracy of 83%. The study shows why the CNN is the most widely used algorithm for hand gesture recognition, as it can consistently classify gestures of varying complexity. The study also shows that the LSTM network can classify hand gestures with a high degree of accuracy, though more experimentation is needed to increase the complexity of the gestures it can recognise.
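As a rough illustration of the kind of classifier the abstract describes, here is a hedged Keras sketch of a small CNN that takes a six-channel Doppler input (three receiver channels plus three differential channels) and classifies eight gesture classes. The input resolution, layer sizes, and training settings are assumptions, not the architecture of [2].

```python
import numpy as np
from tensorflow.keras import layers, models

NUM_CLASSES = 8               # six basic + two rotational gestures
TIME_BINS, FREQ_BINS = 64, 32 # hypothetical Doppler spectrogram resolution
CHANNELS = 6                  # three Doppler channels + three differential channels

model = models.Sequential([
    layers.Input(shape=(TIME_BINS, FREQ_BINS, CHANNELS)),
    layers.Conv2D(16, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data; real inputs would be per-gesture Doppler measurements.
X = np.random.rand(100, TIME_BINS, FREQ_BINS, CHANNELS).astype("float32")
y = np.random.randint(0, NUM_CLASSES, size=100)
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```

The ten-fold evaluation mentioned in the abstract would wrap this training loop in something like scikit-learn's KFold splitter rather than a single fit call.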
24

Edge Caching for Small Cell Networks

Pervej, Md Ferdous 01 August 2019 (has links)
The idea of storing content, such as media files, music files, and movie clips, in advance is simple, yet making it count requires considerable effort. Some of the benefits of pre-storing content are reduced delay in accessing or downloading a content item, reduced load on the centralized servers and, of course, a higher data rate. However, several challenges need to be addressed to achieve these benefits. Among many, some of the fundamental ones are limited storage capacity, storing the right content, and minimizing the costs. This thesis aims to address these challenges. First, a framework is presented for predicting which content should be stored within the limited storage capacity. Then, the cost is minimized considering several real-world scenarios. While doing that, all possible collaborations among the local nodes are exploited to ensure high performance. The goal of this thesis is therefore to come up with a solution to the content storing problem so that the network cost is minimized.
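The abstract describes choosing which content to cache under a limited storage budget so that cost is minimized; a simple hedged sketch of that idea is a greedy placement that fills the cache with the items whose predicted request rate per unit size is highest. The catalogue, sizes, and popularity values below are invented for illustration; the thesis's actual prediction framework and cost model are not reproduced here.

```python
# Greedy cache placement: pick items with the best predicted-requests-per-MB
# ratio until the small cell's storage budget is exhausted.
def place_contents(contents, capacity_mb):
    ranked = sorted(contents,
                    key=lambda c: c["predicted_requests"] / c["size_mb"],
                    reverse=True)
    cache, used = [], 0.0
    for item in ranked:
        if used + item["size_mb"] <= capacity_mb:
            cache.append(item["name"])
            used += item["size_mb"]
    return cache

# Hypothetical catalogue with predicted popularity per item.
catalogue = [
    {"name": "movie_clip_a", "size_mb": 700, "predicted_requests": 120},
    {"name": "song_b",       "size_mb": 8,   "predicted_requests": 300},
    {"name": "movie_clip_c", "size_mb": 900, "predicted_requests": 40},
    {"name": "podcast_d",    "size_mb": 60,  "predicted_requests": 90},
]
print(place_contents(catalogue, capacity_mb=800))
```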
25

Consistent and Accurate Face Tracking and Recognition in Videos

Liu, Yiran 23 September 2020 (has links)
No description available.
26

A Preliminary Investigation into using Artificial Neural Networks to Generate Surgical Trajectories to Enable Semi-Autonomous Surgery in Space

Korte, Christopher M. 15 October 2020 (has links)
No description available.
27

Prediction of nickel product prices with LSTM

Rosendahl, Daniella January 2023 (has links)
Prediction of future stock markets has long been, and will continue to be, a relevant topic. However, predicting markets is one of the most challenging areas to work in due to the unpredictability of the market. The extent to which markets can be predicted is a debated question that has not yet been answered. A common approach is to use machine learning in combination with historical data to predict future prices. In this report, a classical machine learning method, LSTM, is applied to nickel product prices to predict future product prices. The data used is provided by the company Harald Pihl, which has been trading various metals since the early 1900s. As comparative material, the method is also applied to data from the nickel futures market. The results indicate that a larger number of data points is required for the prediction of nickel product prices to generate a credible result. In addition, there is significant variation in the quality of the results depending on the dataset being used. The difference in results is due, among other things, to the number of data points, fluctuations in the dataset, and the regularity of the dataset.
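The abstract applies an LSTM to historical nickel price series. The sketch below shows, with a synthetic price series and an assumed look-back window, how such a series is commonly turned into sliding windows and fed to a small Keras LSTM; it is not the model or data used in the thesis.

```python
import numpy as np
from tensorflow.keras import layers, models

# Synthetic stand-in for a historical price series (e.g. daily nickel prices).
prices = np.cumsum(np.random.randn(500)).astype("float32")

WINDOW = 30  # assumed look-back length
X = np.array([prices[i:i + WINDOW] for i in range(len(prices) - WINDOW)])
y = prices[WINDOW:]
X = X[..., np.newaxis]  # shape (samples, timesteps, 1 feature)

model = models.Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.LSTM(32),
    layers.Dense(1),     # next-step price
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

next_price = model.predict(X[-1:], verbose=0)
print("predicted next value:", next_price[0, 0])
```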
28

AI-augmented analysis onto the impact of the containment strategies and climate change to pandemic

Dong, Shihao January 2023 (has links)
This thesis uses a multi-tasking long short-term memory (LSTM) model to investigate the correlation between containment strategies, climate change, and the number of COVID-19 transmissions and deaths. The study focuses on examining the accuracy of different factors in predicting the number of daily confirmed cases and daily deaths, in order to further explore the correlation between these factors and the case numbers. The initial assessment results suggest that containment strategies, specifically vaccination policies, have a more significant impact on the accuracy of predicting daily confirmed cases and deaths from COVID-19 than climate factors such as the daily average surface 2-meter temperature. Additionally, the study reveals that interactions among certain impact factors have unpredictable effects on predictive accuracy. However, the lack of interpretability of deep learning models poses a significant challenge for real-world applications. This study provides valuable insights into the correlation between the number of daily confirmed cases, daily deaths, containment strategies, and climate change, and highlights areas for further research. It is important to note that while the study reveals a correlation, it does not imply causation, and further research is needed to understand the trends of the pandemic.
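To illustrate the multi-task setup the abstract describes, the following is a hedged Keras sketch of an LSTM with a shared encoder and two output heads, one for daily confirmed cases and one for daily deaths. The feature count (policy indices plus a climate variable), the window length, and all dimensions are assumptions, not the thesis's configuration.

```python
import numpy as np
from tensorflow.keras import layers, Model

TIMESTEPS, FEATURES = 14, 5  # assumed: 14-day window of policy + climate features

inputs = layers.Input(shape=(TIMESTEPS, FEATURES))
shared = layers.LSTM(64)(inputs)                    # shared representation
cases = layers.Dense(1, name="daily_cases")(shared)
deaths = layers.Dense(1, name="daily_deaths")(shared)

model = Model(inputs, [cases, deaths])
model.compile(optimizer="adam",
              loss={"daily_cases": "mse", "daily_deaths": "mse"})

# Random stand-in data; real inputs would be per-country time series.
X = np.random.rand(200, TIMESTEPS, FEATURES).astype("float32")
y_cases = np.random.rand(200, 1).astype("float32")
y_deaths = np.random.rand(200, 1).astype("float32")
model.fit(X, {"daily_cases": y_cases, "daily_deaths": y_deaths},
          epochs=1, verbose=0)
```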
29

Human gait movement analysis using wearable solutions and Artificial Intelligence

Davarzani, Samaneh 09 December 2022 (has links) (PDF)
Gait recognition systems have gained tremendous attention due to their potential applications in healthcare, criminal investigation, sports biomechanics, and so forth. Wearable sensors integrated into wearable objects or mobile devices can provide a new solution to gait recognition tasks. In this research, a sock prototype designed with embedded soft robotic sensors (SRS) is implemented to measure foot-ankle kinematic and kinetic data during three experiments designed to track participants' foot-ankle movement. Deep learning and statistical methods have been employed to model SRS data against a motion capture (MoCap) system, to determine their ability to provide accurate kinematic and kinetic data from SRS measurements. In the first study, the capacitance of the SRS related to basic foot-ankle movements was quantified during the gait of twenty participants on a flat surface and a cross-sloped surface. A second study addressed kinematic features, in which deep learning models were trained to estimate the joint angles in the sagittal and frontal planes measured by a MoCap system. Participant-specific models were established for ten healthy subjects walking on a treadmill. The prototype was tested at various walking speeds to assess its ability to track movements at multiple speeds and to generalize models for estimating joint angles in the sagittal and frontal planes. The last study focuses on kinetic features, with the goal of determining the validity of SRS measurements; to this end, the pressure data measured with the SRS embedded in the sock prototype are compared with force plate data.
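The abstract describes participant-specific deep models that map SRS signals to MoCap joint angles; a minimal hedged Keras sketch of such a sequence-to-value regression is shown below. The channel count, window length, and targets are assumptions made for illustration only.

```python
import numpy as np
from tensorflow.keras import layers, models

TIMESTEPS, SRS_CHANNELS = 100, 4  # assumed window of capacitance samples
ANGLES = 2                        # sagittal- and frontal-plane joint angles

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, SRS_CHANNELS)),
    layers.LSTM(64),
    layers.Dense(32, activation="relu"),
    layers.Dense(ANGLES),         # regress the two joint angles
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Synthetic stand-in for SRS capacitance windows and MoCap angle targets.
X = np.random.rand(300, TIMESTEPS, SRS_CHANNELS).astype("float32")
y = np.random.rand(300, ANGLES).astype("float32")
model.fit(X, y, epochs=1, verbose=0)
```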
30

Predicting user churn on streaming services using recurrent neural networks

Martins, Helder January 2017 (has links)
Providers of online services have witnessed a rapid growth of their user base in the last few years. The phenomenon has attracted an increasing number of competitors determined to obtain their own share of the market. In this context, the cost of attracting new customers has increased significantly, raising the importance of retaining existing clients. It has therefore become progressively more important for companies to improve the user experience and ensure they keep a larger share of their users actively consuming their product. Companies are thus compelled to build tools that can identify what prompts customers to stay and also identify the users intent on abandoning the service. The focus of this thesis is to address the problem of predicting user abandonment, also known as "churn", and to detect motives for user retention, using data provided by an online streaming service. Classical models like logistic regression and random forests have been used in the past to predict the churn probability of a customer with a fair amount of precision, commonly by aggregating all known information about a user over a time period into a single data point. On the other hand, recurrent neural networks, especially the long short-term memory (LSTM) variant, have shown impressive results in other domains like speech recognition and video classification, where the data is treated as a sequence instead. This thesis investigates how LSTM models perform for the task of predicting churn compared to standard non-sequential baseline methods when applied to user behavior data of a music streaming service. It was also explored how different aspects of the data, such as the distribution between the churning and retaining classes, the size of the user event history, and the feature representation, influence the performance of the predictive models. The obtained results show that LSTMs have performance comparable to random forests for churn detection, while being significantly better than logistic regression, making them a suitable choice for predicting churn at the user level. Additionally, a framework for creating a dataset suitable for training predictive models is provided, which can be further explored to analyze user behavior and to create retention actions that minimize customer abandonment.
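As a hedged sketch of the comparison the abstract describes, the code below trains a small LSTM on per-user event sequences and a random forest baseline on the same data aggregated into one vector per user. The dimensions and data are synthetic placeholders, not the streaming-service dataset used in the thesis.

```python
import numpy as np
from tensorflow.keras import layers, models
from sklearn.ensemble import RandomForestClassifier

DAYS, FEATURES = 30, 6  # assumed: 30 days of per-user activity features
rng = np.random.default_rng(0)
X_seq = rng.random((500, DAYS, FEATURES)).astype("float32")
y = rng.integers(0, 2, size=500)  # 1 = churned, 0 = retained

# Sequential model: the LSTM consumes the raw daily event sequence.
lstm = models.Sequential([
    layers.Input(shape=(DAYS, FEATURES)),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
lstm.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
lstm.fit(X_seq, y, epochs=1, verbose=0)

# Non-sequential baseline: aggregate each user's history into one data point.
X_agg = X_seq.mean(axis=1)
rf = RandomForestClassifier(random_state=0).fit(X_agg, y)
print("baseline train accuracy:", rf.score(X_agg, y))
```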
