231

Qt vs. Electron: A Point Cloud Performance Comparison & Investigation of the Qt Framework

Stenius, Robin January 2022
Frameworks for developing cross-platform applications come in many forms, and depending on the functionality of the developed application, some platforms may be a better choice than others. Applications working with point cloud models handle huge numbers of data points, created by scanning an object with a laser scanner; the resulting model can then be loaded into software for display and interaction. This study examines performance differences between two cross-platform desktop application frameworks, Qt and Electron, when working with point clouds, by performing an experiment. Two prototypes were used to measure differences in the time it takes to create the point cloud, the memory allocated for the data points, and the average frames per second achieved throughout a rotation sequence initiated on the point cloud. The study was conducted on-site at an organization that currently uses the Qt framework and wanted to investigate potential differences against an HTML5 framework. It also investigates what expert practitioners working with the Qt framework consider its strengths and limitations to be, using semi-structured interviews to gain in-depth knowledge. The study found that differences between the frameworks cannot be drawn clearly, as many variables influence performance outcomes; under the tested conditions, however, Qt performed better on all occasions. Performance is one way to measure an application and framework, and this study found that working with the Qt framework has strong points as well as weaknesses. The performance and cross-platform capabilities of Qt are well liked, but they can come at the cost of poor documentation and high complexity when developing bigger applications. Using Qt Quick (QML) to develop the GUI is generally appreciated for how it separates the front-end GUI from C++, together with the available modules. However, QML can take time to learn and may not completely remove the need for C++ development.
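To make the frames-per-second metric concrete, here is a minimal sketch (not taken from the thesis) of how an average frame rate over a rotation sequence could be measured in an Electron or browser renderer; the rotatePointCloud callback and the step size are hypothetical stand-ins for whatever rendering step the prototype actually performs.

```ts
// Hypothetical sketch: average FPS over a rotation sequence driven by
// requestAnimationFrame. `rotatePointCloud` is an assumed callback.
function measureAverageFps(
  rotatePointCloud: (angleDeg: number) => void,
  totalDegrees = 360,
  stepDeg = 1
): Promise<number> {
  return new Promise((resolve) => {
    let frames = 0;
    let angle = 0;
    const start = performance.now();

    const tick = () => {
      rotatePointCloud(angle); // render one rotation step
      frames += 1;
      angle += stepDeg;
      if (angle < totalDegrees) {
        requestAnimationFrame(tick);
      } else {
        const elapsedSec = (performance.now() - start) / 1000;
        resolve(frames / elapsedSec); // average frames per second
      }
    };
    requestAnimationFrame(tick);
  });
}
```

A Qt prototype would measure the same quantity with its own timing facilities, so the comparison rests on identical rotation sequences in both prototypes.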
232

Digital CV for Axture: Made in WordPress for clients and employees

Johansson, John January 2022
The goal of this project is to remake a static HTML page into a dynamic page in the CMS WordPress. The page itself is a digital CV used for presenting people looking for work to employers. It should fulfil as many of the WCAG (Web Content Accessibility Guidelines) criteria as possible, while giving employees an easy way to create new CV pages. This report goes through the work of creating a WordPress theme from a static HTML page: the overall as well as the concrete goals, the importance of following WCAG, the whole process of creating the site and the problems WordPress inherently brings, the testing performed and the conclusions drawn from it, examples of code with explanations of how that code works as well as illustrations, the conclusions and results, and finally the ethics of this kind of work.
233

Location aware web access

Charvandeh, Jenny January 2009
The user's mobile communication device has an increasing sense of where the user is. This location information may be very fine grained or very coarse. Given some amount of location information, it is possible to create location-aware services. This thesis presents and evaluates a system for location-aware web browsing. Indoors, the user can click on a point on a map to establish a virtual location using a previously installed user application; outdoors, the location can be provided by GPS; or the location might be provided by some other location system (indoors or outdoors). Each HTTP GET request for a URL is then augmented with information about the user's real or virtual location: a web query is created, and the location, encoded as longitude and latitude, is appended to it. The web server uses this location information to dynamically generate location-aware web pages, which are finally shown in a web browser.
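As an illustration of the augmentation step described above, the following sketch appends a user's coordinates to an outgoing web query; the parameter names latitude and longitude are assumptions, not taken from the thesis.

```ts
// Hypothetical sketch: append the user's (real or virtual) coordinates
// to an outgoing web query as longitude/latitude query parameters.
function augmentWithLocation(url: string, lat: number, lon: number): string {
  const query = new URL(url);
  query.searchParams.set("latitude", lat.toFixed(6));
  query.searchParams.set("longitude", lon.toFixed(6));
  return query.toString();
}

// e.g. augmentWithLocation("https://example.com/search?q=cafe", 59.3293, 18.0686)
// -> "https://example.com/search?q=cafe&latitude=59.329300&longitude=18.068600"
```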
234

Web server from ASP.NET 4.8 to Blazor Server

Söderlund, Malin January 2021
This report addresses the design of a web server with Blazor Server for the company FunRock and their mobile strategy game MMA Manager. The web server is intended to administer settings of the game's different components; staff can, for example, search for a user or a user's belongings. The report is limited to the design of the most frequently used pages of the previous web server. Furthermore, it presents a usability and accessibility analysis of the previous web server, which serves as the basis for the design of the new Blazor Server application. The purpose of converting to Blazor Server has been to enable more flexible hosting, faster development and better performance than the previous server, written in ASP.NET 4.8. Finally, the latter part of the report offers the author's more subjective analysis of the work performed, with reflections on the project's results.
235

Historical Maps: An interactive experience of history through maps

Nilsson, Susanne January 2021
This report deals with the development of the Historical Maps web application for a company that works to preserve the cultural heritage of old maps. The goal of the project has been to create a proof of concept for a web application that allows the user to view and interact with historical maps, as well as to explore the history and facts behind the maps, areas and points of interest within them. Work on the application began with the foundational design: a flowchart, a mood board and design sketches. Based on these, a proof of concept was developed in the form of an interactive web application written in React, with libraries such as Material UI for the user interface and Leaflet for interactive map functions. For factual content, fetch calls are made to the Wikipedia API. Trade-offs had to be made regarding the functionality implemented in this first version of the web application, in order to represent how the interactivity with the maps can work while also demonstrating potential future functionality. The finished application achieves the purpose of the assignment as a proof of concept: interactivity in the maps and connections between the maps are implemented, along with static elements that represent future possibilities, and information and facts about areas and points of interest are retrieved from Wikipedia through its API.
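For illustration, here is a minimal sketch of such a fetch call, assuming the application maps each point of interest to a Wikipedia article title and uses Wikipedia's public REST summary endpoint; the function name and types are hypothetical, not taken from the report.

```ts
// Minimal sketch: fetch a plain-text summary for a point of interest
// from Wikipedia's REST summary endpoint (assumed integration).
interface WikiSummary {
  title: string;
  extract: string; // plain-text summary paragraph
}

async function fetchPoiSummary(articleTitle: string): Promise<WikiSummary> {
  const url =
    "https://en.wikipedia.org/api/rest_v1/page/summary/" +
    encodeURIComponent(articleTitle);
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Wikipedia API error: ${response.status}`);
  const data = (await response.json()) as { title: string; extract: string };
  return { title: data.title, extract: data.extract };
}
```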
236

Defeating phishing and pharming attacks at the client-side

Gastellier-Prevost, Sophie 24 November 2011
The development of online transactions and "always-connected" broadband Internet access is a great improvement for Internet users, who can now benefit from easy access to many services regardless of time or location. The main drawback of this new marketplace is that it attracts attackers looking for easy and rapid profits. One major threat is the phishing attack.
By using website forgery to spoof the identity of a company that offers financial services, phishing attacks trick Internet users into revealing confidential information (e.g. login, password, credit card number). Because most end-users check the legitimacy of a login website by looking at the visual aspect of the webpage displayed by the web browser, with no consideration for the visited URL or the presence and positioning of security components, attackers capitalize on this weakness and design near-perfect copies of legitimate websites, displayed through a fraudulent URL. To attract as many victims as possible, phishing attacks are most often carried out through spam campaigns. One popular method for detecting phishing attacks is to integrate anti-phishing protection into the user's web browser (i.e. an anti-phishing toolbar), which makes use of two kinds of classification methods: blacklists and heuristic tests. The first part of this thesis studies the effectiveness and value of heuristic tests in differentiating legitimate from fraudulent websites; we conclude by identifying the decisive heuristics and discussing their life span. In more sophisticated versions of phishing attacks, i.e. pharming attacks, the threat is imperceptible to the user: the visited URL is the legitimate one and the visual aspect of the fake website is very similar to the original. As a result, pharming attacks are particularly effective and difficult to detect. They are carried out by exploiting DNS vulnerabilities at the client side, in the ISP (Internet Service Provider) network, or at the server side. While many efforts address this problem in the ISP network and at the server side, the client side remains excessively exposed. In the second part of this thesis, we introduce two approaches, intended to be integrated into the client's web browser, to detect pharming attacks at the client side. These approaches combine an IP address check with a webpage content analysis, performed using information provided by multiple DNS servers; their main difference lies in the method of retrieving the webpage used for the comparison. Through two sets of experiments, we validate our concept.
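As a rough illustration of the IP address check (not the thesis's actual implementation), the following Node.js sketch resolves a hostname through the system resolver and through an alternative DNS server, treating an empty intersection of the answers as a possible pharming indicator; the alternative server address is illustrative.

```ts
// Minimal sketch of an IP-address cross-check against an alternative
// DNS server. The chosen server (8.8.8.8) and threshold logic are
// illustrative assumptions.
import { promises as dnsPromises } from "node:dns";

async function ipAddressesMatch(hostname: string): Promise<boolean> {
  // Answer from the system's (possibly corrupted) resolver.
  const systemIps = await dnsPromises.resolve4(hostname);

  // Reference answer from an alternative resolver.
  const altResolver = new dnsPromises.Resolver();
  altResolver.setServers(["8.8.8.8"]);
  const referenceIps = await altResolver.resolve4(hostname);

  // A shared address suggests consistent resolution; no overlap is
  // suspicious, and a webpage comparison would serve as second check.
  return systemIps.some((ip) => referenceIps.includes(ip));
}
```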
237

Web Browser for Squeak Smalltalk

Šlemr, Martin January 2008
This Master's thesis is about the Scamper web browser in the Squeak Smalltalk system environment: its current state, and a new design and implementation that respects the CSS box model and visual formatting model, including tables. It also describes web browsers in general and Internet technologies such as the HTTP protocol and the MIME structure. The next part of the document describes the Squeak Smalltalk system and its graphical environment, Morphic.
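For reference, the CSS box model arithmetic such a layout engine must respect can be summarized in a short sketch (TypeScript here purely for illustration; Scamper itself is written in Smalltalk, and these type names are hypothetical).

```ts
// CSS box model: horizontal space occupied by a box is the sum of
// margins, borders, paddings and the content width, from outside in.
interface BoxSides { left: number; right: number }
interface Box {
  contentWidth: number;
  padding: BoxSides;
  border: BoxSides;
  margin: BoxSides;
}

function totalHorizontalSpace(b: Box): number {
  return (
    b.margin.left + b.border.left + b.padding.left +
    b.contentWidth +
    b.padding.right + b.border.right + b.margin.right
  );
}
```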
238

Distribution System for e-Shops

Gavenda, Martin January 2008
The aim of my thesis is to create a model of an information system supporting internet stores. Together with creating this model, I also analyse the possible use of new technologies such as AJAX and communication with the help of XML. The final work consists of two applications: the first is a catalogue of products (goods) and the second is an internet store. The main application provides services for this store.
239

Automatic Comics Generator

Holec, Pavel Unknown Date
The subject of this thesis is the implementation of a program that automatically generates a comics page from an input comics script. It also discusses various techniques of image processing and the implementation of graphical filters that can transform a photograph into a drawing or another artistic picture.
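As an illustration of the kind of filter involved (the thesis does not specify its algorithms), here is a simple edge-based "pencil drawing" effect over an RGBA pixel buffer; the function name and threshold are hypothetical.

```ts
// Illustrative sketch, not the thesis's actual filter: grayscale the
// image and darken strong horizontal luminance edges to mimic pencil
// strokes on white paper.
function sketchFilter(
  pixels: Uint8ClampedArray, // RGBA, row-major
  width: number,
  height: number,
  edgeThreshold = 24
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(pixels.length);
  // Perceptual luminance of the pixel starting at byte offset i.
  const lum = (i: number) =>
    0.299 * pixels[i] + 0.587 * pixels[i + 1] + 0.114 * pixels[i + 2];

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      const right = x + 1 < width ? lum(i + 4) : lum(i);
      // Strong luminance change -> dark "pencil" line, else white paper.
      const v = Math.abs(lum(i) - right) > edgeThreshold ? 0 : 255;
      out[i] = out[i + 1] = out[i + 2] = v;
      out[i + 3] = 255; // fully opaque
    }
  }
  return out;
}
```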
240

Generator and Administrator of Dedicated Web Pages

Vřesňák, Pavel Unknown Date
The focus of this Master's thesis is the creation of a tool for generating photo galleries, both for personal presentation and as an internet portal for interested exhibitors. The software is aimed at users with little or no experience in creating internet applications. UML was used for the design, while the web part of the application was implemented with the help of PHP, MySQL, JavaScript and CSS. The Win32 user interface was realised using Microsoft Visual C#. The work concludes with a brief analysis of the realistic benefits users gain from the application.
