121

Game mechanics’ effects on user retention

Hall, Henning, Lantz, Alexander January 2017 (has links)
In the era of smartphones and millions of apps just a few taps away, the effort required to retain users in a particular app is increasing. Gamification is a relatively new tool for increasing user engagement and is used in a wide range of apps belonging to categories far removed from traditional games. Game mechanics, like achievements and leaderboards, belong to the fundamentals of gamification. This master thesis examines how some of these game mechanics affect user retention. A custom-made app game was implemented as an experiment to measure the effects of different game mechanics over a couple of months. The results indicate that game mechanics do affect some types of user retention, but also that they may have no impact, or a negative one, on other types. In the end, the purpose of the app might play a central role when picking the "right" mechanic. This demonstrates the importance of knowledge about game mechanics and their effects.
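The "types of user retention" mentioned above typically refer to metrics such as day-1 or day-7 retention. Below is a minimal sketch of how such retention rates could be computed from app session logs; the log format, cohort definition and field names are assumptions for illustration, not the thesis's implementation.

```python
from datetime import date

# Toy session log: user id -> set of dates on which the user opened the app.
# In practice this would come from the experiment app's analytics events.
sessions = {
    "u1": {date(2017, 3, 1), date(2017, 3, 2), date(2017, 3, 8)},
    "u2": {date(2017, 3, 1)},
    "u3": {date(2017, 3, 1), date(2017, 3, 8)},
}

def day_n_retention(sessions: dict, cohort_day: date, n: int) -> float:
    """Share of users first seen on cohort_day who returned exactly n days later."""
    cohort = [u for u, days in sessions.items() if min(days) == cohort_day]
    if not cohort:
        return 0.0
    target = date.fromordinal(cohort_day.toordinal() + n)
    retained = sum(1 for u in cohort if target in sessions[u])
    return retained / len(cohort)

print("Day-1 retention:", day_n_retention(sessions, date(2017, 3, 1), 1))
print("Day-7 retention:", day_n_retention(sessions, date(2017, 3, 1), 7))
```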
122

Mapping the current state of SSL/TLS / Kartläggning av nuvarande användning av SSL/TLS

Lindström, Per, Pap, Oscar January 2017 (has links)
The focus of this thesis is to analyze cryptographic protocol and algorithm usage based on three different data sets, the largest of which contains one million websites. Ciphers, algorithms and different versions of cryptographic protocols are described and discussed. Some of the common attacks that render these ciphers or protocols obsolete are also presented and explained. In order to obtain the data sets and analyze them, a web crawler and a prober program were developed. The results from the analysis show that more popular websites have better security than less popular ones. The results also show that many broken and insecure ciphers and protocols are still widely used today. The thesis discusses the use of old software as a potential reason why so many websites support weak protocols and ciphers.
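As a rough illustration of what a prober of this kind does, the sketch below checks which TLS protocol versions a host will negotiate. It is a minimal example under assumed defaults, not the thesis's prober, and the hostname is a placeholder.

```python
import socket
import ssl

# Protocol versions to probe; older ones may be refused by modern OpenSSL builds,
# which the error handling below treats as "not supported".
VERSIONS = [
    ("TLSv1.0", ssl.TLSVersion.TLSv1),
    ("TLSv1.1", ssl.TLSVersion.TLSv1_1),
    ("TLSv1.2", ssl.TLSVersion.TLSv1_2),
    ("TLSv1.3", ssl.TLSVersion.TLSv1_3),
]

def probe(host: str, port: int = 443) -> dict:
    """Return a mapping of TLS version name -> True if the handshake succeeded."""
    results = {}
    for name, version in VERSIONS:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False          # we only care about protocol support
        ctx.verify_mode = ssl.CERT_NONE
        ctx.minimum_version = version       # pin the handshake to a single version
        ctx.maximum_version = version
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    results[name] = True
                    # tls.cipher() would also reveal the negotiated cipher suite
        except (ssl.SSLError, OSError):
            results[name] = False
    return results

if __name__ == "__main__":
    print(probe("example.com"))  # placeholder host
```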
123

Using Artificial Intelligence to Verify Authorship of Anonymous Social Media Posts

Lagerholm, Filip January 2017 (has links)
The widespread use of social media, the ease of concealing one's identity offered by ubiquitous technology, and the digitization of crime and terrorism have increased the need for ways to find out who hides behind an anonymous alias. This report deals with authorship verification of posts written on Twitter, with the purpose of investigating whether it is possible to develop an auxiliary tool that can be used in crime investigation activities. The main research question is whether a set of tweets written by an anonymous user can be matched to another set of tweets written by a known user and whether, based on their linguistic styles, it is possible to calculate the probability that the authors are the same. The report also examines how linguistic styles can be extracted for use in an artificially intelligent classification, and how much data is needed to get adequate results. The subject matter is interesting because the work concerns a potential future scenario in which digital crimes are difficult to investigate with traditional network-based tracking techniques. The approach is to evaluate traditional feature extraction methods from natural language processing and to classify the extracted features using a type of recurrent neural network called Long Short-Term Memory. While the best result in the experiments reached an accuracy of 93.32%, the overall results showed that the choice of representation and the amount of data used are crucial. This thesis complements existing knowledge by focusing on very short texts in the form of social media posts.
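A minimal sketch of the kind of classifier described above: an LSTM over token sequences that outputs a same-author probability. The vocabulary size, sequence length, choice of Keras as the framework and the toy data are all assumptions for illustration, not the thesis's actual model or features.

```python
import numpy as np
import tensorflow as tf

VOCAB_SIZE = 5000     # assumed size of the word/character token vocabulary
MAX_LEN = 60          # assumed maximum number of tokens per encoded tweet pair

# LSTM binary classifier: embeds token ids, reads the sequence, outputs P(same author).
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# x: integer-encoded token sequences for known-author / unknown-author tweet pairs,
# y: 1 if both halves were written by the same author, else 0 (random toy data here).
x = np.random.randint(1, VOCAB_SIZE, size=(256, MAX_LEN))
y = np.random.randint(0, 2, size=(256,))
model.fit(x, y, epochs=2, batch_size=32, validation_split=0.2)
```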
124

A study of today's wireless networks in a school environment / En undersökning av dagens trådlösa nätverk i skolmiljö

Åkerlind, Klas January 2017 (has links)
In today's society, cloud services are becoming increasingly popular. Companies move their services to the cloud to get better access to material and to handle growth more easily. While cloud services can offload local services on a network, they can also lead to increased Internet usage. There are many factors to weigh in a transition to cloud services, such as functionality, cost, security and more. Our work attempts to describe the possible consequences the increased Internet usage can have on a network. In our study we examine a primary school in Söderhamn municipality that will start using cloud services in the autumn of 2017. To get a better picture of which problems may arise with this upgrade, we study both the school's network infrastructure and the intended use of the cloud services. The amount of network traffic generated at the school is measured and compared with another school that already uses cloud services, and the load on the school's wireless access points is reviewed. In this case, we find that the number of users connecting to the Internet via wireless access points at the school exceeds the recommended number of simultaneous users in some locations. Based on measurements and statistics, there is also a risk that the amount of traffic in the network will increase, depending on the extent to which pupils in middle school and below use the Internet and digital learning materials.
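A small illustration of the kind of access-point load check described above: counting concurrently connected clients per access point and flagging those that exceed a recommended limit. The data, access point names and the threshold of 30 concurrent users are invented for illustration, not taken from the study.

```python
# Hypothetical measurement: access point -> number of concurrently connected clients.
concurrent_clients = {
    "AP-library": 42,
    "AP-classroom-2": 18,
    "AP-cafeteria": 55,
}

RECOMMENDED_MAX = 30  # assumed vendor recommendation for simultaneous users per AP

for ap, clients in concurrent_clients.items():
    status = "OVERLOADED" if clients > RECOMMENDED_MAX else "ok"
    print(f"{ap}: {clients} concurrent clients ({status})")
```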
125

Taktila hjälpmedel för hörselskadade inom musiken / Vibrotactile aid for the hearing impaired within music

Pettersson, Sebastian, Palfi, Kristoffer January 2017 (has links)
No description available.
126

The Design and Implementation of a Subscription-Based E-shop That Is Easily Navigated and Visually Appealing to Its Users / Design och implementation av en prenumerationsbaserad e-shop som är lättnavigerad och visuellt tilltalande för dess användare

Erkén, Amanda, Cederblad, Cecilia, Saller, Christophe, Wahlström, Daniel, Odlinder Haubo, Erik, Lewenhaupt, Hugo, Hellström, Jesper, Wallin, Johan, Andreasson, Oscar January 2017 (has links)
The report presents a study of the design and implementation of a web application intended to be easy to navigate and understand as well as to induce a positive attitude in its users. A pilot study and a marketing plan acted as the foundation for the implementation of the application. The evaluation consisted of a thinking-aloud usability test as well as a user attitude test. An analysis of the results indicated that the web application fulfilled the demands set by the research group, and it was ultimately concluded that factors such as visual feedback, clarity in the purchasing process and a fixed navbar were essential to achieving the stated purpose. Lastly, it was also concluded that the research was not extensive enough to draw general conclusions about inducing positive attitudes.
127

Comparison of hardware firewalls in a network environment

Elnerud, Albin January 2017 (has links)
Today’s market offers a wide range of available firewalls; there are many manufacturers and each of them has at least several series of possible solutions. As organisations and companies seek to protect their assets against current and new hostile threats, the demand for network security increases and drives the development of firewalls forward. With new firewall technologies emerging from a wide variety of firewall vendors, choosing the right firewall can be both costly and time consuming. Requirements for a concrete network need to be correlated with security functionalities, i.e., metrics for firewalls. Incorrect requirements formulation, or an incorrect mapping of requirements to metrics, can lead to financial loss or to a firewall failing to provide the desired security functionalities. In this thesis, firewalls from three different manufacturers are investigated. The firewalls are compared and evaluated using requirements derived for Eskilstuna municipality's network. To identify solutions fulfilling the requirements, metrics related to the requirements are identified. Two different placements for firewall deployment are considered separately, as they have different requirements. The firewall comparison consists of two steps. The first step evaluates the firewalls from each manufacturer separately. After the best-suited firewall from each manufacturer has been identified, the second step compares the best solutions from the different manufacturers against each other. The outcome of the comparison is a firewall solution that fulfills all requirements and can be considered the optimal choice for the investigated network environment.
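A hedged sketch of what mapping requirements to metrics and scoring candidate firewalls against them might look like; the requirements, weights and candidate data below are invented for illustration and are not the thesis's actual metrics or results.

```python
# Hypothetical requirement -> weight mapping for one deployment placement.
requirements = {
    "throughput_gbps >= 10": 3,
    "supports_ips": 2,
    "supports_vpn": 2,
    "max_concurrent_sessions >= 1_000_000": 1,
}

# Hypothetical candidate firewalls with the metrics relevant to the requirements.
candidates = {
    "Vendor-A model X": {"throughput_gbps": 12, "supports_ips": True,
                         "supports_vpn": True, "max_concurrent_sessions": 2_000_000},
    "Vendor-B model Y": {"throughput_gbps": 8, "supports_ips": True,
                         "supports_vpn": False, "max_concurrent_sessions": 500_000},
}

def satisfies(metrics: dict, requirement: str) -> bool:
    """Evaluate a simple requirement expression against a candidate's metrics."""
    # eval is acceptable here because the expressions are trusted, hard-coded strings.
    return bool(eval(requirement, {}, dict(metrics)))

def score(metrics: dict) -> int:
    """Weighted sum of the requirements a candidate fulfills."""
    return sum(weight for req, weight in requirements.items() if satisfies(metrics, req))

best = max(candidates, key=lambda name: score(candidates[name]))
for name, metrics in candidates.items():
    print(name, "->", score(metrics))
print("Best suited:", best)
```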
128

Design and evaluation of a system that coordinates clients to use the same server / Design och utvärdering av ett system som koordinerar klienter att använda samma server.

Gustavsson, Per, Gavefalk, Erica January 2017 (has links)
This thesis presents a system which examines when clients should select a new server in order to maximize the number of packets received with the fewest number of server selections. The ratio between the number of packets received and the number of server selections should therefore be as high as possible. To help minimize the number of server selections, the system should adapt if conditions change. We construct a server selection mechanism which guarantees agreement between clients when it is invoked by the system. Furthermore, four policies have been created that determine when the mechanism is to be invoked. The purpose of this is to find out when and how it is best to switch server, and how the policies can adapt to varying connection failure rates between server and clients. We evaluate the system through simulations and conclude that the policy we call on demand, which invokes the server selection mechanism when clients have no connection to the server, is the best choice if the aim is to deliver as many packets as possible while invoking the server selection mechanism as few times as possible, with adaptation to varying connection failure rates. This would enable the system to keep providing information even if conditions change for the better or for the worse.
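A toy simulation of the on-demand idea sketched above: clients keep sending packets to the current server and only trigger an agreed-upon server selection when the connection fails. The failure rates, number of rounds and the random selection rule are assumptions for illustration, not the thesis's simulation setup.

```python
import random

random.seed(1)

# Hypothetical per-server probability that a packet from a client fails.
failure_rate = {"server-A": 0.05, "server-B": 0.40, "server-C": 0.10}
servers = list(failure_rate)

def select_server() -> str:
    """Stand-in for the agreement-based selection mechanism: all clients
    end up on the same server (here chosen at random for simplicity)."""
    return random.choice(servers)

current = select_server()
delivered = 0
selections = 1

for _ in range(10_000):                      # simulation rounds, one packet per round
    if random.random() < failure_rate[current]:
        # "On demand" policy: only reselect when the connection fails.
        current = select_server()
        selections += 1
    else:
        delivered += 1

print(f"packets delivered: {delivered}, server selections: {selections}, "
      f"ratio: {delivered / selections:.1f}")
```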
129

Using NLP and context for improved search results in specialized search engines

Carlstedt, Martin January 2017 (has links)
No description available.
130

Data Mining of Trouble Tickets for Automatic Action Recommendation

Löfgren, Jonathan January 2017 (has links)
This work investigates the possibility of applying machine learning and data mining to the problem of finding solutions to software and hardware problems arising in telecommunication systems. Trouble ticket data is analyzed using traditional data mining techniques and more complex machine learning models, including neural networks, to find out which types of models are suitable for the task. The results show that there are relevant correlations in the data which enable the root cause to be predicted with up to 90% accuracy for the most common root cause, and up to 70% when classifying between up to 20 root causes. These predictive models could be used to assist engineers by giving them probable suggestions for the root cause, potentially saving time in the troubleshooting process. Relatively simple data mining and linear models performed best, which shows that this could be implemented in practice given that these methods are fast, robust, memory efficient and easy to implement. Neural networks were also investigated but gave no significant improvement in the results, although there are indications that they could outperform linear models if more training data were available. A large fraction of the collected data could not be used for analysis because of missing values and other inconsistencies, which highlights the importance of defining standards for the data collection process. This would lead to higher-quality data and allow the trained models to be more general and perform well in multiple locations.
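A minimal sketch of the kind of simple linear model the abstract favors: TF-IDF features over ticket text fed to a logistic regression classifier that predicts a root-cause label. The ticket texts and labels are invented toy data, and scikit-learn is assumed as the library; this is not the thesis's pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy trouble tickets: free-text description -> root-cause label.
tickets = [
    ("cell down after power outage at site", "power_failure"),
    ("link flapping on microwave backhaul", "transport_fault"),
    ("board restart loop after software upgrade", "software_fault"),
    ("no power at site, battery depleted", "power_failure"),
    ("packet loss on backhaul link during rain", "transport_fault"),
    ("process crash following patch installation", "software_fault"),
]
texts, labels = zip(*tickets)

# Linear model over TF-IDF features: fast, robust and easy to implement,
# matching the abstract's conclusion that simple models performed best.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

new_ticket = "site unreachable, power supply alarm active"
print(model.predict([new_ticket])[0])           # predicted root cause
print(model.predict_proba([new_ticket]).max())  # confidence behind the suggestion
```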
