531

En studie av hur en webbapplikation för annonsering av konsultuppdrag till studenter kan implementeras för att uppfattas som användbar / A study of how a web application for advertising consulting jobs to students can be implemented to be perceived as useful

Åström, Adam, Öberg, Albin, Elkjaer, Alice, Olsson, Fredrik, Bengtsson Malmborg, Hannes, Jacobson, Madeleine, Schwartz-Blicke, Oscar, Storsved, Viktor January 2021 (has links)
Studentkonsultprojekt gör det möjligt för studenter att applicera sin kunskap i näringslivet samtidigt som det blir mindre kostsamt för företagen att anlita konsulter. Då jobbsökande via internet blir allt vanligare finns det ett behov av en webbapplikation för konsultuppdrag som kopplar samman studenter och företag. En av de viktigaste aspekterna för att skapa en konkurrenskraftig webbapplikation är användbarheten. Således är intentionen med denna studie att undersöka Hur kan en webbapplikation för konsultuppdrag mellan företag och studenter implementeras för att uppfattas som användbar av studenter? För att besvara frågeställningen har en webbapplikation för förmedling av konsulttjänster mellan företag och studenter utvecklats. Webbapplikationen baseras på en teoretisk grund där olika dimensioner av begreppet användbarhet analyserats. De dimensioner som lyfts är effektivitet, ändamålsenlighet och tillfredsställelse. I tillägg till detta har vikten av att specificera användare och de estetiska aspekternas påverkan på användbarhet behandlats. För att utvärdera om webbapplikationen upplevs som användbar testas den på tre testgrupper i tre olika skeden för att undersöka deras upplevelse av webbapplikationen. Testerna utgår från metoden thinking aloud tillsammans med enkäterna System Usability Scale (SUS) och Visual Aesthetics of Websites Inventory Short (VisAWI-S). SUS- och VisAWI-S-enkäterna gav indikationer på en starkt användbar applikation genom hela utvecklingsprocessen. Detta utifrån implementation av en design som främst utgick från principerna enkelhet och färgrikedom samt fokusområdena Synlighet av systemstatus, Igenkänning istället för återkallande, Flexibilitet och effektiv användning och Estetisk och minimalistisk design. Genom att analysera resultaten från thinking aloud-testerna kunde en tydlig minskning av negativa kommentarer identifieras mellan användartest 1-3. Utifrån dessa testresultat, dras slutsatserna att genom återkoppling relaterad till utförda aktioner, implementation av markörer och färgval med hänsyn till kontraster kan en webbapplikation för konsultjobb implementeras för att uppfattas som användbar av studenter. / Student consulting projects make it possible for students to apply their knowledge in real business cases. In addition, they reduce the cost for enterprises of hiring consultants. As job hunting via the internet becomes more common, there is a need for a web application that connects company projects with students who are interested in consulting. One of the most important aspects of a competitive web application is usability. The intention of this study is therefore to examine: How can a web application for consulting jobs between enterprises and students be implemented so that students perceive it as useful? To answer this question, a web application for the intermediation of consulting jobs between enterprises and students has been developed. The web application is built on a theoretical basis in which the different dimensions of the term usability have been analysed. These dimensions are efficiency, effectiveness and satisfaction. In addition, the importance of specifying users and the effect of aesthetic aspects on usability have been discussed. To evaluate whether the web application is perceived as useful, it is tested on three groups of people on three different occasions to examine their perception of the web application. The tests are based on the thinking aloud method together with the questionnaires System Usability Scale (SUS) and Visual Aesthetics of Websites Inventory Short (VisAWI-S). The SUS and VisAWI-S questionnaires indicated that the web application had a high level of usability throughout the development process. This was achieved by implementing a design based on simplicity and colorfulness as well as the principles Visibility of system status, Recognition rather than recall, Flexibility and efficiency of use, and Aesthetic and minimalist design. By analysing the results from the thinking aloud tests, a clear reduction in negative comments between tests 1 and 3 could be identified. From these test results, the conclusion is that, through feedback related to completed actions, the implementation of markers, and color choices with sufficient contrast, a web application for consulting jobs between enterprises and students can be implemented so that students perceive it as useful.
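As an aside for readers unfamiliar with the questionnaire mentioned above: SUS responses are conventionally reduced to a single 0-100 score. A minimal TypeScript sketch of that standard scoring rule (illustrative only, not code from the thesis):

```typescript
// Minimal sketch of standard SUS scoring (0-100); illustrative, not taken from the thesis.
// `responses` holds the ten questionnaire answers on a 1-5 Likert scale, in item order.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 item responses");
  }
  const sum = responses.reduce((acc, answer, i) => {
    // Odd-numbered items (index 0, 2, ...) are positively worded: contribution = answer - 1.
    // Even-numbered items (index 1, 3, ...) are negatively worded: contribution = 5 - answer.
    const contribution = i % 2 === 0 ? answer - 1 : 5 - answer;
    return acc + contribution;
  }, 0);
  return sum * 2.5; // scale the 0-40 raw sum to the familiar 0-100 range
}

// Example: a fairly positive respondent.
console.log(susScore([5, 2, 4, 1, 5, 2, 4, 2, 5, 1])); // 87.5
```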
532

Σχεδίαση και ανάπτυξη διαδικτυακής εφαρμογής υποστήριξης μελετών χρηστών / Design and development of a web application for supporting user studies

Δημογιάννης, Δημήτριος 13 October 2013 (has links)
Στόχος της παρούσης διπλωματικής είναι η δημιουργία μιας διαδικτυακής εφαρμογής όπου θα υποστηρίζει τη δημιουργία, τη διαχείριση και την εκτέλεση μιας μελέτης αξιολόγησης γραφικού σχεδιασμού διεπιφανειών χρήστη, με τη μέθοδο της χαρτογράφησης προτίμησης, καθώς επίσης και τη συλλογή και αποθήκευση των αποτελεσμάτων. Μετά από βιβλιογραφική έρευνα κρίθηκε σκόπιμο να υλοποιηθεί η μέθοδος της χαρτογράφησης προτίμησης για τη διεξαγωγή της μελέτης. Στη συνέχεια προσδιορίστηκαν οι αρχές πάνω στις οποίες βασίστηκε η σχεδίαση και η ανάπτυξη της εφαρμογής. Η παρούσα εφαρμογή είναι πλούσιας διαδραστικότητας, μπορεί να υλοποιήσει αξιολόγηση από απόσταση και να ικανοποιεί τις απαιτήσεις του ανταποκρινόμενου σχεδιασμού, ενώ ταυτόχρονα παρέχεται η δυνατότητα προσθήκης νέων μεθόδων. Για την υλοποίηση της εφαρμογής χρησιμοποιήθηκαν οι νέες τεχνολογίες διαδικτύου HTML5, CSS3, jQuery καθώς και οι PHP, MySQL, JavaScript, AJAX. Η τελική αξιολόγηση από ειδικούς ευχρηστίας του εργαστηρίου αλληλεπίδρασης ανθρώπου υπολογιστή, έκρινε την εφαρμογή ικανή να εκπληρώσει το στόχο της και έδωσε θετική ανάδραση για περαιτέρω βελτίωση. / The aim of the present diploma thesis is the development of a web application which supports the creation, management and execution of a graphic design evaluation study by implementing the method of preference mapping, as well as the collection and storage of results. After an extended literature review, the method of preference mapping was found to be the most suitable for conducting the study. Subsequently, the main principles were identified upon which the design and development of the application were based. This rich internet application is able to conduct remote evaluation while satisfying responsive design, which was a fundamental requirement. It also provides the ability to add new evaluation methods. For the implementation of the application, new internet technologies such as HTML5, CSS3 and jQuery were used, as well as PHP, MySQL, JavaScript and AJAX. The usability evaluation of the application was conducted by usability experts of the human-computer interaction laboratory. The conclusion of the experts was that the application is highly usable, and their recommendations provided effective feedback for further improvement.
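To make the remote-evaluation setup above concrete, the sketch below shows how a participant's preference-mapping judgement could be posted to the server without a page reload. The endpoint, payload shape and field names are hypothetical, not taken from the thesis:

```typescript
// Illustrative only: send one preference-mapping judgement to the study server.
interface PreferenceRating {
  participantId: string;
  stimulusId: string;   // which interface design variant was shown
  rating: number;       // e.g. a 1-9 preference score
  timestampMs: number;
}

async function submitRating(rating: PreferenceRating): Promise<void> {
  const response = await fetch("/api/ratings", {   // hypothetical server-side endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(rating),
  });
  if (!response.ok) {
    throw new Error(`Rating upload failed: ${response.status}`);
  }
}
```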
533

Evaluierung von AJAX-basierten frameworks für das Web 2.0 / Evaluation of AJAX-based frameworks for Web 2.0

Langer, André 20 April 2007 (has links) (PDF)
For several years, "remote scripting" applications have seen a veritable boom in demand. While, from a usability perspective, there used to be a strict distinction between desktop applications and web applications, the World Wide Web has for some time offered more and more services that blur this strict separation. Interactive user dialogues, concurrent processing and visual aids such as drag-and-drop effects are finding their way onto web pages, features that users previously knew only from stand-alone software products in a specific operating-system environment. Many of these new application and interaction possibilities on the web are now grouped under the umbrella term Web 2.0. For the user, this development trend brings many advantages: appealing, intuitive user guidance without the need to reload an entire page at every interaction step and without noticeable time overhead. What is meant to make things easier for the user, however, often means extra effort for the programmer at first. A technique for realising such so-called Rich Internet Applications, which has pushed more and more to the fore over the last two years, is grouped under the name AJAX. There is no uniform standard, so new AJAX-based frameworks are released almost daily, intended to relieve the programmer of (at least part of) the complexity of controlling the program flow. The task of this student research project is therefore to systematise the now unmanageable range of AJAX frameworks and to give an overview of the advantages and disadvantages of selected libraries. To this end, a catalogue of criteria is to be developed that allows the various frameworks to be assessed from different points of view. Particular emphasis is to be placed on criteria from the programmer's perspective (language independence, overhead, implementation options, ...) and from the user's perspective (platform requirements, learning time, quality of results, ...). The catalogue of criteria is then to be applied to a selection of existing, freely available AJAX frameworks that are judged to remain relevant in the future. Finally, the results are to be presented in an overall overview intended as an objective recommendation for users who face the choice of which AJAX library to adopt.
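For context, the sketch below illustrates the raw pattern that all such AJAX frameworks wrap: fetch data asynchronously and update only a fragment of the page. It uses the modern fetch API rather than the XMLHttpRequest object of the 2007 era; the URL and element id are illustrative:

```typescript
// Minimal sketch of the pattern AJAX frameworks abstract away: asynchronous request,
// then a partial page update without a full reload.
async function refreshNewsPanel(): Promise<void> {
  const response = await fetch("/news/latest.json"); // asynchronous HTTP request
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const items: { title: string }[] = await response.json();
  const panel = document.getElementById("news-panel");
  if (panel) {
    // Replace only this fragment of the page; everything else stays untouched.
    panel.innerHTML = items.map(item => `<li>${item.title}</li>`).join("");
  }
}
```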
534

Evaluierung von AJAX-basierten frameworks für das Web 2.0 / Evaluation of AJAX-based frameworks for Web 2.0

Langer, André. Anders, Jörg. January 2007 (has links)
Chemnitz, Technische Universität, Studienarbeit (student research project), 2007.
535

Snow depth measurements and predictions : Reducing environmental impact for artificial grass pitches at snowfall

Forsblom, Findlay, Ulvatne, Lars Petter January 2020 (has links)
Rubber granulates used at artificial grass pitches pose a threat to the environment when leaking into nature. As the granulates leak into the environment through rainwater and snow clearing, they can be transported by rivers and eventually end up in marine life. Therefore, reducing snow clearing to a minimum is important. If the snow clearance problem is minimized or even eliminated, this will have a positive impact on the surrounding environment. The objective of this project is to propose a method for deciding when to remove snow and to automate the dissemination of information when a pitch is cleared or closed. This includes finding low-powered sensors to measure snow depth, finding a machine learning model to predict upcoming snow levels, and creating an application with a clear and easy-to-use interface to present weather information and notify the responsible persons. Controlled experiments are used to find the models and sensors that are suitable for solving this problem. The sensors are tested on a single snow quality, where ultrasonic and infrared sensors are found suitable. However, fabricated tests with newly fallen snow call into question the possibility of measuring snow depth with the ultrasonic sensor in the general case. Random Forest is presented as the machine learning model that predicts future snow levels with the highest accuracy. A survey indicates that the web application fulfills the intended functionality, with some improvements suggested.
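As an illustration of the sensor side discussed above, here is a minimal sketch of converting an ultrasonic echo time into a snow depth using standard time-of-flight arithmetic; the mounting height, temperature assumption and example values are illustrative, not the thesis's calibration:

```typescript
// Illustrative sketch: an ultrasonic sensor mounted above the pitch reports an echo time,
// which gives the distance to the snow surface; depth = mounting height - that distance.
const SPEED_OF_SOUND_M_PER_S = 343; // at roughly 20 °C; varies with air temperature

function snowDepthMeters(echoTimeSeconds: number, mountingHeightMeters: number): number {
  // The pulse travels to the snow surface and back, hence the division by 2.
  const distanceToSurface = (echoTimeSeconds * SPEED_OF_SOUND_M_PER_S) / 2;
  return Math.max(0, mountingHeightMeters - distanceToSurface);
}

// Example: sensor mounted 1.5 m above the pitch, echo returns after ~7.6 ms
// => distance ≈ 1.30 m => snow depth ≈ 0.20 m.
console.log(snowDepthMeters(0.0076, 1.5).toFixed(2));
```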
536

On the security of authentication protocols on the web / La sécurité des protocoles d’authentification sur le Web

Delignat-Lavaud, Antoine 14 March 2016 (has links)
Est-il possible de démontrer un théorème prouvant que l’accès aux données confidentielles d’un utilisateur d’un service Web (tel que GMail) nécessite la connaissance de son mot de passe, en supposant certaines hypothèses sur ce qu’un attaquant est incapable de faire (par exemple, casser des primitives cryptographiques ou accéder directement aux bases de données de Google), sans toutefois le restreindre au point d’exclure des attaques possibles en pratique ? Il existe plusieurs facteurs spécifiques aux protocoles du Web qui rendent impossible une application directe des méthodes et outils existants issus du domaine de l’analyse des protocoles cryptographiques. Tout d’abord, les capacités d’un attaquant sur le Web vont largement au-delà de la simple manipulation des messages échangés entre le client et le serveur sur le réseau. Par exemple, il est tout à fait possible (et même fréquent en pratique) que l’utilisateur ait dans son navigateur un onglet contenant un site contrôlé par l’adversaire pendant qu’il se connecte à sa messagerie (par exemple, via une bannière publicitaire) ; cet onglet est, comme n’importe quel autre site, capable de provoquer l’envoi de requêtes arbitraires vers le serveur de GMail, bien que la politique d’isolation des pages du navigateur empêche la lecture directe de la réponse à ces requêtes. De plus, la procédure pour se connecter à GMail implique un empilement complexe de protocoles : tout d’abord, un canal chiffré, et dont le serveur est authentifié, est établi avec le protocole TLS ; puis, une session HTTP est créée en utilisant un cookie ; enfin, le navigateur exécute le code JavaScript retourné par le client, qui se charge de demander son mot de passe à l’utilisateur. Enfin, même en imaginant que la conception de ce système soit sûre, il suffit d’une erreur minime de programmation (par exemple, une simple instruction goto mal placée) pour que la sécurité de l’ensemble de l’édifice s’effondre. Le but de cette thèse est de bâtir un ensemble d’outils et de librairies permettant de programmer et d’analyser formellement de manière compositionnelle la sécurité d’applications Web confrontées à un modèle plausible des capacités actuelles d’un attaquant sur le Web. Dans cette optique, nous étudions la conception des divers protocoles utilisés à chaque niveau de l’infrastructure du Web (TLS, X.509, HTTP, HTML, JavaScript) et évaluons leurs compositions respectives. Nous nous intéressons aussi aux implémentations existantes et en créons de nouvelles que nous prouvons correctes afin de servir de référence lors de comparaisons. Nos travaux mettent au jour un grand nombre de vulnérabilités aussi bien dans les protocoles que dans leurs implémentations, ainsi que dans les navigateurs, serveurs, et sites internet ; plusieurs de ces failles ont été reconnues d’importance critique. Enfin, ces découvertes ont eu une influence sur les versions actuelles et futures du protocole TLS. / As ever more private user data gets stored on the Web, ensuring proper protection of this data (in particular when it transits through untrusted networks, or when it is accessed by the user from her browser) becomes increasingly critical. However, in order to formally prove that, for instance, email from GMail can only be accessed by knowing the user's password, assuming some reasonable set of assumptions about what an attacker cannot do (e.g. he cannot break AES encryption), one must precisely understand the security properties of many complex protocols and standards (including DNS, TLS, X.509, HTTP, HTML, JavaScript), and more importantly, the composite security goals of the complete Web stack. In addition to this compositional security challenge, one must account for the powerful additional attacker capabilities that are specific to the Web, besides the usual tampering of network messages. For instance, a user may browse a malicious page while keeping an active GMail session in a tab; this page is allowed to trigger arbitrary, implicitly authenticated requests to GMail using JavaScript (even though the isolation policy of the browser may prevent it from reading the response). An attacker may also inject himself into an honest page (for instance, as a malicious advertising script, or by exploiting a data sanitization flaw), get the user to click bad links, or try to impersonate other pages. Besides the attacker, the protocols and applications are themselves a lot more complex than typical examples from the protocol analysis literature. Logging into GMail already requires multiple TLS sessions and HTTP requests between (at least) three principals, representing dozens of atomic messages. Hence, ad hoc models and hand-written proofs do not scale to the complexity of Web protocols, mandating the use of advanced verification automation and modeling tools. Lastly, even assuming that the design of GMail is indeed secure against such an attacker, any single programming bug may completely undermine the security of the whole system. Therefore, in addition to modeling protocols based on their specification, it is necessary to evaluate implementations in order to achieve practical security. The goal of this thesis is to develop new tools and methods that can serve as the foundation towards an extensive compositional Web security analysis framework that could be used to implement and formally verify applications against a reasonably extensive model of attacker capabilities on the Web. To this end, we investigate the design of Web protocols at various levels (TLS, HTTP, HTML, JavaScript) and evaluate their composition using a broad range of formal methods, including symbolic protocol models, type systems, model extraction, and type-based program verification. We also analyze current implementations and develop some new verified versions to run tests against. We uncover a broad range of vulnerabilities in protocols and their implementations, and propose countermeasures that we formally verify, some of which have been implemented in browsers and by various websites. For instance, the Triple Handshake attack we discovered required a protocol fix (RFC 7627), and influenced the design of the new version 1.3 of the TLS protocol.
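To illustrate the cross-tab attack surface described above, the sketch below shows one generic, widely deployed mitigation: rejecting state-changing requests whose Origin header is not the application's own, combined with SameSite session cookies. This is an illustrative baseline defence, not the formal verification approach developed in the thesis; the origin and cookie values are hypothetical:

```typescript
// Illustrative sketch of a standard defence against a malicious tab silently firing
// implicitly authenticated requests: check the Origin header on writes, and mark the
// session cookie SameSite so browsers stop attaching it to cross-site requests.
const TRUSTED_ORIGIN = "https://mail.example.com"; // hypothetical application origin

function isAllowedStateChangingRequest(method: string, originHeader: string | null): boolean {
  const isStateChanging = !["GET", "HEAD", "OPTIONS"].includes(method.toUpperCase());
  if (!isStateChanging) return true;        // reads are left to the same-origin policy
  return originHeader === TRUSTED_ORIGIN;   // writes must come from our own pages
}

// Cookie attributes that keep the session from being implicitly attached cross-site.
const sessionCookie = "SID=abc123; Secure; HttpOnly; SameSite=Lax";

console.log(isAllowedStateChangingRequest("POST", "https://evil.example.org")); // false
console.log(isAllowedStateChangingRequest("POST", TRUSTED_ORIGIN));             // true
```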
537

Informační systém pro podporu prodeje / Information system for sales support

Štupák, Branislav January 2018 (has links)
The diploma thesis deals with the design and implementation of an information system for sales support. The first part of the thesis presents a market analysis and describes the legislative requirements. These theoretical findings are then applied in the design of the system. The system is designed in the spirit of good user experience (UX). The application is created with an offline-first approach using the React Native framework for cross-platform development of mobile applications. Synchronization between end-point devices is handled by the CouchDB database system.
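One common way to realise such an offline-first setup in a JavaScript/TypeScript client is to replicate a local PouchDB database against CouchDB; whether the thesis uses PouchDB specifically is an assumption here, and the database names and document shape are illustrative:

```typescript
// Hedged sketch of an offline-first sync loop as commonly built with PouchDB against CouchDB.
import PouchDB from "pouchdb";

const localOrders = new PouchDB("orders");                          // lives on the device
const remoteOrders = new PouchDB("https://db.example.com/orders");  // CouchDB endpoint

// Writes always go to the local database first, so the app keeps working offline.
async function recordOrder(id: string, customer: string, total: number): Promise<void> {
  await localOrders.put({ _id: id, customer, total, createdAt: new Date().toISOString() });
}

// Continuous two-way replication pushes local changes up and pulls remote ones down
// whenever connectivity is available.
localOrders
  .sync(remoteOrders, { live: true, retry: true })
  .on("change", info => console.log("replicated", info.direction))
  .on("error", err => console.error("sync error", err));
```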
538

Získávání informací o uživatelích na webových stránkách / Browser and User Fingerprinting for Practical Deployment

Vondráček, Tomáš January 2021 (has links)
The aim of the diploma thesis is to map the information provided by web browsers that can be used in practice to identify users on websites. The work focuses on obtaining, and subsequently analysing, information about devices, browsers, and the side effects caused by web extensions that mask users' identity. The information is acquired by a library designed and implemented in TypeScript, which was deployed on four commercial websites. The analysis of the obtained information is carried out after a month of operation of the library and focuses on the amount of information obtained, the speed with which it is obtained, and its stability. The dataset shows that up to 94 % of potentially different users have a unique combination of information. The main contribution of this work lies in the created library, the design of new methods for obtaining information, the optimization of existing methods, and the classification of information as high or low quality based on its information value, speed of acquisition, and stability over time.
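For illustration, the sketch below collects a few of the browser attributes that fingerprinting libraries of this kind typically combine; it is not the thesis library, and the attribute set and key format are deliberately simplistic:

```typescript
// Illustrative sketch of collecting a handful of fingerprinting attributes in the browser.
interface FingerprintSample {
  userAgent: string;
  language: string;
  timezone: string;
  screenSize: string;
  hardwareConcurrency: number;
}

function collectSample(): FingerprintSample {
  return {
    userAgent: navigator.userAgent,
    language: navigator.language,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    screenSize: `${screen.width}x${screen.height}x${screen.colorDepth}`,
    hardwareConcurrency: navigator.hardwareConcurrency,
  };
}

// Collapse the attributes into a single comparable string; uniqueness of this value
// across visitors is what the "94 % unique combinations" figure above refers to.
function fingerprintKey(sample: FingerprintSample): string {
  return Object.values(sample).join("|");
}

console.log(fingerprintKey(collectSample()));
```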
539

Využití metod dolování dat pro analýzu sociálních sítí / Using of Data Mining Method for Analysis of Social Networks

Novosad, Andrej January 2013 (has links)
The thesis discusses data mining of social media. It gives an introduction to the topic of data mining and possible mining methods. The thesis also explores social media and social networks, what they are able to offer and what problems they bring. The APIs of three social networking sites are examined, along with the opportunities they provide for data mining. Techniques of text mining and document classification are explored. The implementation of a web application that mines data from the social site Twitter using the SVM algorithm is described. The implemented application classifies tweets based on their text, with classes representing the tweets' continents of origin. Several experiments executed both in the RapidMiner software and in the implemented web application are then proposed and their results examined.
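As a hint at the preprocessing such a classifier needs, the sketch below turns a tweet into a term-frequency vector over a fixed vocabulary, which could then be fed to a trained SVM. The vocabulary and the downstream classifier call are hypothetical; the thesis's actual feature set is not reproduced here:

```typescript
// Illustrative text-preprocessing step preceding an SVM classifier: tweet -> term counts.
const vocabulary = ["rain", "metro", "beach", "snow", "football"]; // hypothetical example terms

function tweetToVector(text: string): number[] {
  const tokens = text
    .toLowerCase()
    .replace(/[^a-z\s]/g, " ")   // crude cleanup: keep letters only
    .split(/\s+/)
    .filter(t => t.length > 0);
  return vocabulary.map(term => tokens.filter(t => t === term).length);
}

// Example usage: the resulting vector would be passed to a trained SVM,
// e.g. classify(tweetToVector(tweet)) with whatever SVM library is chosen.
console.log(tweetToVector("Heavy rain at the football match tonight")); // [1, 0, 0, 0, 1]
```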
540

Vizualizace georeferencovaných informací na webovém mapovém rozhraní / Georeferenced Data Visualization on Web-Based Map Interface

Růžička, Štěpán January 2011 (has links)
The master's thesis is concerned with the design and implementation of a library extending OpenLayers. The JavaScript programming language was used for the solution. Part of the thesis is devoted to a description of standards for maintaining and transferring geographic information, as well as to JavaScript map-presentation libraries and REST services.
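A minimal sketch of the kind of georeferenced visualization such a library targets, written against the current module-based OpenLayers API (the thesis itself predates this API); the GeoJSON URL is a hypothetical REST endpoint:

```typescript
// Illustrative OpenLayers setup: base map plus a vector layer fed by a georeferenced
// GeoJSON service.
import Map from "ol/Map";
import View from "ol/View";
import TileLayer from "ol/layer/Tile";
import OSM from "ol/source/OSM";
import VectorLayer from "ol/layer/Vector";
import VectorSource from "ol/source/Vector";
import GeoJSON from "ol/format/GeoJSON";
import { fromLonLat } from "ol/proj";

const pointsLayer = new VectorLayer({
  source: new VectorSource({
    url: "https://api.example.com/points.geojson", // hypothetical REST endpoint
    format: new GeoJSON(),
  }),
});

new Map({
  target: "map", // id of the <div> that hosts the map
  layers: [new TileLayer({ source: new OSM() }), pointsLayer],
  view: new View({ center: fromLonLat([16.6, 49.2]), zoom: 12 }), // Brno, for example
});
```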
