111

Effects and opportunities of native code extensions for computationally demanding web applications

Jarosch, Dennis, 17 January 2012
The World Wide Web is amidst a transition from interactive websites to web applications. An increasing number of users perform their daily computing tasks entirely within the web browser, turning the Web into an important platform for application development. The Web as a platform, however, lacks the computational performance of native applications. This problem has motivated the inception of Microsoft Xax and Google Native Client (NaCl), two independent projects that facilitate the development of native web applications. Both allow conventional web applications to be extended with compiled native code while maintaining operating system portability. This dissertation determines the benefits and drawbacks of native web applications and examines their actual performance compared with conventional JavaScript web applications. An experimental performance analysis compares native C applications, JavaScript web applications, and NaCl native web applications across four application benchmarks covering different performance aspects: number crunching (serial and parallel), 3D graphics, and data processing. The results show that NaCl excels at computational tasks and 3D graphics but has substantial limitations in data processing. These limitations are evaluated and possible solutions are discussed. The performance analysis is complemented by an evaluation against technical and non-technical criteria and a discussion of the technical, political, and strategic drivers for NaCl's adoption.
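The serial number-crunching side of such a comparison can be sketched in plain JavaScript. The kernel below is an illustrative stand-in for the dissertation's benchmark suite, not its actual code; the workload and array sizes are made up.

```javascript
// Illustrative serial number-crunching kernel of the sort used to
// compare JavaScript against native (C / NaCl) implementations.
function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) sum += a[i] * b[i];
  return sum;
}

function benchmark(n) {
  const a = new Float64Array(n).fill(1.5);
  const b = new Float64Array(n).fill(2.0);
  const t0 = Date.now();
  const result = dotProduct(a, b);
  const elapsedMs = Date.now() - t0;
  return { result, elapsedMs };
}

const { result } = benchmark(1_000_000);
console.log(result); // 1.5 * 2.0 summed over 1e6 elements = 3000000
```

Timing the same kernel compiled as C and as a NaCl module against this JavaScript version is the shape of experiment the abstract describes.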
112

Získávání informací o uživatelích na webových stránkách / Browser and User Fingerprinting for Practical Deployment

Vondráček, Tomáš, January 2021
The aim of this diploma thesis is to map the information provided by web browsers that can be used in practice to identify users on websites. The work focuses on obtaining and analysing information about devices, browsers, and the side effects caused by web extensions that mask the identity of users. Information acquisition is realized by a library designed and implemented in TypeScript, which was deployed on 4 commercial websites. The collected information is analysed after a month of operation, focusing on the amount of information obtained, the speed of acquisition, and the stability of the information. The dataset shows that up to 94 % of potentially different users have a unique combination of information. The main contribution of this work lies in the created library, the design of new methods of obtaining information, the optimization of existing methods, and the classification of high- and low-quality information based on informativeness, speed of acquisition, and stability over time.
113

Miljöpåverkan från efterbehandling av förorenade områden : En livscykelanalys av schaktsanering ur ett klimat- och resurshanteringsperspektiv / Environmental impact from remediation of contaminated areas

Oleskog, Astrid, January 2023
In Sweden, there are currently approximately 86,000 inventoried sites that are or are suspected of being contaminated. A contaminated site can have a harmful effect on humans, animals, and the environment when they are exposed to the contaminants. Contaminated areas may therefore need to be remediated to locally improve soil quality and to reduce risks. A problem that has begun to attract attention in the industry is that remediation can itself lead to significant negative environmental consequences, such as greenhouse gas emissions and the use of fossil resources. For example, the Swedish environmental quality objective "Reduced climate impact" might not be achieved unless practices change. The most common remediation method in Sweden is "dig and dump". The purpose of this study was to investigate the climate impact and resource use of "dig and dump" by performing a life cycle assessment of the method, and to compare its climate impact with that of other remediation methods. The results showed that for "dig and dump", transport and landfilling of the soil contributed most to the climate impact and resource use. Reduced transport distances, vehicles with lower energy consumption per unit of transport work, or a switch to renewable fuels all reduced the environmental impact. Landfilling of excavated soil was also energy demanding, so incentives to recycle and reuse soil to a greater extent than today are called for. Compared with other remediation methods, biochar caused a relatively small climate impact. In addition, biochar led to more resource-efficient waste management through reduced disposal of soil and organic waste and reduced extraction of virgin raw materials for backfilling.
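The transport contribution identified above follows a simple structure: emissions scale with mass hauled, distance, and an emission factor per tonne-kilometre. The sketch below illustrates that arithmetic only; the factor, payload, and distances are hypothetical placeholders, not values from the study's life cycle inventory.

```javascript
// Illustrative climate-impact estimate for hauling excavated soil.
// The emission factor is a made-up placeholder for a diesel truck,
// not a figure from the study.
const EMISSION_FACTOR_KG_CO2E_PER_TKM = 0.1; // hypothetical

function transportEmissions(massTonnes, distanceKm,
                            factor = EMISSION_FACTOR_KG_CO2E_PER_TKM) {
  return massTonnes * distanceKm * factor; // kg CO2e
}

// 1000 t of soil hauled 50 km to landfill, plus 1000 t of backfill
// hauled 50 km to the site:
const total = transportEmissions(1000, 50) + transportEmissions(1000, 50);
console.log(total); // 10000 kg CO2e under these assumptions
```

Shortening either leg, or lowering the factor (better vehicles, renewable fuels), reduces the total linearly, which is why those levers dominate the findings above.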
114

Undersökning av webbsidors säkerhet vid användning av Facebook Login : Vidareutveckling och analys av OAuthGuard / Investigating website security when using Facebook Login: Further development and analysis of OAuthGuard

Hedmark, Alice, January 2019
Single Sign-On (SSO) is an authentication process that allows a developer to delegate the authentication responsibility to a dedicated service. OAuth 2.0 is an authorization framework that often serves as the base for an authentication layer that in turn enables SSO. An identity provider is the service responsible for handling user credentials and authentication; two common identity providers are Google and Facebook, which implement SSO through the authentication layers OpenID Connect and Facebook's own authentication layer, respectively. It has been shown that many clients using OAuth 2.0 as the basis for SSO implement it incorrectly, leading to security flaws; a number of studies have proposed solutions to these issues, but faulty implementations continue to be made. One approach is to build tools that promote security in these contexts: OAuthGuard was developed with the vision of also protecting the ordinary website user directly from the browser. An earlier study used OAuthGuard to analyse the security of clients using Google SSO and found that 50% of the analysed clients had flaws; no comparable study existed for clients using Facebook SSO, the second largest third-party login variant. This study performed a comparable investigation of Facebook SSO clients with a further developed version of OAuthGuard and found that these clients suffer from flaws with a trend similar to the earlier Google SSO results, although fewer Facebook SSO clients are affected. While further developing OAuthGuard, a number of difficulties were discovered, and the future of this kind of tool needs further analysis. Further analysis is also needed to assess whether Facebook SSO is preferable to Google SSO from a security perspective, and new methods of promoting security should be explored.
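One well-known class of OAuth 2.0 client flaw is omitting the `state` parameter from the authorization request, which leaves the client open to CSRF. The sketch below shows the shape of such a check; it is an illustration of the general idea, not OAuthGuard's actual implementation, and the URLs are made-up examples.

```javascript
// Minimal sketch of one kind of check a browser-side tool can perform:
// an OAuth 2.0 authorization request should carry a non-trivial
// `state` parameter as CSRF protection.
function hasCsrfProtection(authorizationUrl) {
  const params = new URL(authorizationUrl).searchParams;
  const state = params.get('state');
  return state !== null && state.length >= 8; // non-trivial value
}

// Hypothetical example requests, not real clients:
const good = 'https://www.facebook.com/v3.3/dialog/oauth' +
  '?client_id=123&redirect_uri=https%3A%2F%2Fexample.com%2Fcb' +
  '&state=af0ifjsldkj';
const bad = 'https://www.facebook.com/v3.3/dialog/oauth' +
  '?client_id=123&redirect_uri=https%3A%2F%2Fexample.com%2Fcb';

console.log(hasCsrfProtection(good)); // true
console.log(hasCsrfProtection(bad));  // false
```

A browser extension observing outgoing requests can apply checks like this before the redirect leaves the user's machine, which is the "protect the user directly from the browser" vision described above.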
115

Requesting Utility in Usability - Perspectives from a large team software engineering project

Heinstedt, Elin, Johansson, Niklas, January 2001
Many companies invest large amounts of money in developing new technology without knowing how it will be used. To succeed in making these technologies useful, it is necessary to understand the context that gives meaning to the artifact. In the case of generic products, especially in new domains, the context is not obvious. This bachelor thesis analyses what Usability Engineering, Participatory Design, and Ethnography can contribute to the problem of learning about the context of use for generic artifacts. Understanding and identifying details of context is considered important for achieving usability in software development. Most recommendations on usability methods, however, concern situations with specific users in a specific context. To find important aspects of the real-world use of generic products, we suggest that ethnographic studies be conducted in contexts where behaviours relevant to the design are likely to be found. The problem of not knowing the context was experienced in usability work practised in a large software engineering project whose task was to develop a web browser for Symbian's "Quartz" reference design for handheld devices. The methods used were taken from participatory design and usability engineering.
116

Vem använder lösenordshanterare? : En undersökning av demografiska variablers påverkan på användning av lösenordshanterare / Who uses password managers? An investigation of the influence of demographic variables on the use of password managers

Andersson, Markus, Vilmusenaho, Viktor, January 2020
Password managers have long been available, and much research indicates that their use is limited. Their functionality helps the user generate and save a unique, strong password for each individual login. We designed a survey based on related research and a modified version of the Technology Acceptance Model in order to investigate the influence of demographic variables on the use of password managers. The quantitative data were gathered by publishing the survey on the platform Reddit.com and were then analysed using statistical methods, where a number of statistically significant differences were found. Gender, geographic location, number of unique passwords, and computer proficiency all had significant effects on either actual system use or the attitude towards the system. These results are discussed in relation to the presented theory and related research.
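The kind of group difference such a survey tests for can be sketched with a two-proportion z-test, a standard way to compare adoption rates between two demographic groups. The counts below are made-up illustration data, not the survey's results, and the thesis does not state which test it used.

```javascript
// Two-proportion z-test: is the adoption rate in group 1 (x1 of n1)
// significantly different from group 2 (x2 of n2)?
function twoProportionZ(x1, n1, x2, n2) {
  const p1 = x1 / n1, p2 = x2 / n2;
  const pPool = (x1 + x2) / (n1 + n2);      // pooled proportion
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / n1 + 1 / n2));
  return (p1 - p2) / se;
}

// e.g. 120/200 password-manager users in one group vs 90/200 in another:
const z = twoProportionZ(120, 200, 90, 200);
console.log(Math.abs(z) > 1.96); // true: significant at the 5 % level
```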
117

Prohlížečová hra s umělou inteligencí / Browser Game with Artificial Intelligence

Moravec, Michal, January 2019
The thesis describes the design and implementation of a web browser game that can be played by multiple players over the internet. The main goal is to manage an economy, although players can cooperate (trading) or play against each other (battles). A NoSQL database, also described in the thesis, is used for persistent storage of progress. Apart from human players there are agents (bots) that play the game autonomously via state machines generated by genetic algorithms. The thesis describes the design and functionality of both the genetic algorithms and the state machines.
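The idea of evolving state machines can be sketched as follows: an agent is a finite state machine encoded as a transition table, and the genetic algorithm mutates transitions between generations. The encoding, mutation operator, and machine below are illustrative assumptions, not the thesis's actual design.

```javascript
// An agent as a transition table: machine[state][inputSymbol] = nextState.
// A GA mutation rewires one randomly chosen transition.
function mutate(machine, numStates, numInputs, rng = Math.random) {
  const copy = machine.map(row => row.slice()); // don't touch the parent
  const s = Math.floor(rng() * numStates);      // pick a state...
  const i = Math.floor(rng() * numInputs);      // ...and an input symbol
  copy[s][i] = Math.floor(rng() * numStates);   // rewire that transition
  return copy;
}

// Running a machine: fold the input sequence through the table.
function run(machine, inputs, start = 0) {
  return inputs.reduce((state, sym) => machine[state][sym], start);
}

// 2 states, 2 input symbols:
const machine = [[0, 1], [1, 0]];
console.log(run(machine, [1, 1, 0])); // 0
```

A full GA would score each mutated machine by letting it play the game (fitness), then select and recombine the best performers; only the mutation step is shown here.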
118

Zobrazení 3D scény ve webovém prohlížeči / Displaying 3D Graphics in Web Browser

Sychra, Tomáš, January 2013
This thesis discusses the options for accelerated 3D scene rendering in a web browser. In more detail, it deals with the WebGL standard and its use in real applications. An application for visualization of volumetric medical data was designed and implemented based on JavaScript, WebGL, and the Three.js library. Image data are loaded from Google Drive cloud storage. An important part of the application is 3D visualization of the volumetric data using a volume rendering technique called ray-casting.
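At the heart of ray-casting volume rendering is compositing: samples taken along each ray through the volume are blended front to back until opacity saturates. The 1-D sketch below shows that step only; in the real application it runs per pixel in a WebGL shader, and the sample values here are illustrative.

```javascript
// Front-to-back compositing of samples along one ray. Each sample has
// a color and an alpha from the transfer function.
function compositeRay(samples) {
  let color = 0, alpha = 0;
  for (const s of samples) {
    const a = s.alpha * (1 - alpha);   // weight by remaining transparency
    color += s.color * a;
    alpha += a;
    if (alpha >= 0.99) break;          // early ray termination
  }
  return { color, alpha };
}

const ray = [
  { color: 1.0, alpha: 0.5 },
  { color: 0.5, alpha: 0.5 },
];
const out = compositeRay(ray);
console.log(out); // { color: 1.0*0.5 + 0.5*0.25 = 0.625, alpha: 0.75 }
```

Early ray termination is what makes the technique practical at interactive rates: once a ray is nearly opaque, the remaining samples cannot change the pixel visibly.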
119

Modern Methods for Tree Graph Structures Rendering

Zajíc, Jiří, January 2013
This project addresses the display of large hierarchical structures, in particular the options for visualizing tree graphs. The goal is to implement a hyperbolic browser in a web environment that uses the potential of non-Euclidean geometry to project a tree onto the hyperbolic plane. Great emphasis is placed on user-friendly manipulation of the displayed model and easy orientation.
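The projection idea can be sketched as follows: nodes are placed by depth and angle, and the hyperbolic radial distance is compressed into the unit (Poincaré) disk, so deep subtrees shrink toward the rim while the focus stays readable at the centre. The layout constants below are illustrative assumptions, not the project's actual parameters.

```javascript
// Map a tree node at a given depth and angular position onto the
// Poincaré disk: tanh compresses unbounded hyperbolic distance into
// a radius strictly inside the unit circle.
function toPoincare(depth, angleRad, step = 0.7) {
  const r = Math.tanh(depth * step);     // hyperbolic radius -> [0, 1)
  return { x: r * Math.cos(angleRad), y: r * Math.sin(angleRad) };
}

const root = toPoincare(0, 0);
const deep = toPoincare(10, 0);
console.log(root.x);        // 0: the root sits at the disk centre
console.log(deep.x < 1);    // true: every node stays inside the disk
```

Refocusing on another node is then a hyperbolic translation of the whole layout, which is what gives such browsers their characteristic fisheye navigation.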
120

Změny dokumentu v editoru anotací / Document Modifications in Annotation Editor

Cudrák, Miloš, January 2014
This thesis deals with the design and implementation of document modifications and other improvements to the annotation editor developed as part of the Decipher project. It explains the nature of the Decipher project and how the 4A annotation system fits into it. It examines the annotation editor and proposes solutions to its problems, adding new functionality that makes it easier to work with annotations and with the editor itself.
