341

Web interface for a multi-purpose transmitter

Cederlöf, Elin, Mattsson, Maximilian January 2023
The project described in this report aims to construct a web interface for a multi-purpose transmitter (MPT). The MPT is a submodule intended for use in future chromatography systems. This bachelor thesis project was carried out at Cytiva in Uppsala. The MPT uses the Azure RTOS development suite, with ThreadX as the RTOS, FileX as the file system, and NetX for the TCP/IP protocol stack implementation. The Azure RTOS platform facilitates development for deeply embedded systems and comes with extensive documentation for its services. The web interface consists of five web pages with five different use cases: Overview for visualization of basic data; Data for visualization, recording, and filtering of all data; Update for firmware updates and file management; Log for development logging purposes; and Config for modifying parameters and running module diagnostic tests. Development of the front end was done in Visual Studio Code. The workflow was streamlined by developing a tool at the beginning of the project that combines the front-end files into variables stored in C source files. Testing of new features was aided by a Python Flask server developed in parallel with the main back end. It provides the same functionality as the main back end, apart from being hosted on a local computer. The web interface will mainly be used by customers and service personnel, which requires the interface to have two different access levels: the Overview page can be accessed without any authorization, while the other pages require the user to be authenticated with a username and password. / Manufacturing and research in the life sciences require very precise and well-calibrated instruments. These instruments also need maintenance, both because they can break down and to ensure that they remain correctly calibrated. To simplify the servicing of life-science instruments, we have developed a web interface in this project. A web interface is a web page that can be reached in a browser in order to interact with a module. The work was carried out at Cytiva, a world-leading life-science company. Part of the project consisted of evaluating alternative solutions, including the choice of data format, communication protocol, libraries, frameworks, and method for data updates. This evaluation was based mainly on the following aspects: memory footprint, robustness, and performance. The web server is written in the C programming language and uses the Azure RTOS development platform, which includes the ThreadX real-time operating system, the FileX file system, and the NetX network stack. To make it easier to test updates, we created a development server in parallel with our C web server. This server is written in Python using the Flask framework; it has the same functionality as the MPT web server and can be run locally on a computer. The final web interface has five pages with five different functions: Overview for visualizing basic data, Data for visualizing and filtering all data on the module, Update for file management and updating the module's firmware, Log for logging during development, and Config for adjusting parameters and running diagnostics on the module. What is shown in the browser, the front end, is written in HTML, CSS, and JavaScript. The use of front-end frameworks and libraries was evaluated, but the conclusion was that the need did not outweigh the increase in memory footprint. The code from the different languages was woven together by a tool developed to simplify building the interface. The tool is written in Python and makes development both faster and easier by enabling front-end development in the Visual Studio Code IDE.
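The file-bundling tool described in this abstract is not published with it; a minimal sketch of the idea, embedding each front-end file as a C byte array plus a length constant so that firmware can serve it from flash, might look like the following (the function and variable naming scheme is an assumption, not the thesis's actual tool):

```python
from pathlib import Path

def c_identifier(name: str) -> str:
    """Turn a file name like 'index.html' into a valid C identifier."""
    return "".join(ch if ch.isalnum() else "_" for ch in name)

def embed_bytes(name: str, data: bytes) -> str:
    """Emit a C array holding one file's bytes, plus a length constant."""
    ident = c_identifier(name)
    body = ",".join(str(b) for b in data)
    return (f"const unsigned char {ident}[] = {{{body}}};\n"
            f"const unsigned int {ident}_len = {len(data)};\n")

def bundle(sources, out_path: Path) -> None:
    """Concatenate every embedded front-end file into one C source file."""
    parts = [embed_bytes(p.name, p.read_bytes()) for p in map(Path, sources)]
    out_path.write_text("".join(parts))
```

The generated C file can then be compiled into the firmware image, letting the embedded web server hand each array straight to the TCP/IP stack without a separate file-system step.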
342

Evaluating Blazor WebAssembly for the Progressive Web Application Front-End : A Comparative Study Using ReactJS as a Baseline

Rashidi, Vida, Segelström, William January 2021
This study evaluates the Blazor WebAssembly framework for the Progressive Web Application (PWA) methodology. A comparative study is conducted with a ReactJS PWA as a baseline. The two frameworks are evaluated on their front-end performance and on their documentation of PWA-focused subjects. Front-end performance is measured with two experimental applications that test loading times, heap memory usage, and loading consistency during layout generation. It is found that a Blazor WebAssembly PWA takes on average 0.34, 0.18, and 0.06 seconds less time than a ReactJS PWA to generate a layout, but that it is less consistent in its loading times when handling many elements. Documentation covering Progressive Web Application terminology was found to differ between the frameworks: the Blazor WebAssembly documentation covers more topics and provides first-hand knowledge, while the ReactJS documentation covers fewer topics and relies on external sources for the necessary explanations. These findings indicate that Blazor WebAssembly is faster than ReactJS when updating large numbers of elements, that the ReactJS PWA is overall more consistent in its loading times, and that the Blazor WebAssembly documentation covers more Progressive Web Application subjects in greater depth. This study only evaluates applications developed in ReactJS and Blazor WebAssembly, both tested on Google Chrome in a desktop environment.
343

En jämnförelse av prestanda och skalbarhet för grafgenerering i datavisualiserande Javascript-bibliotek : Ett jämnförande experiment på Chart.js, ApexCharts, Billboard, och ToastUI / A comparision of performance and scalability of chart generation for Javascript data visualisation libraries : A comparative experiment on Chart.js, ApexCharts, Billboard, and ToastUI

Magnusson Millqvist, Hamlet, Bolin, Niklas January 2022
On the web, data visualisation through charts and diagrams can help present data in a more readable way. This is often done using JavaScript libraries. We experimented with five JavaScript data visualisation libraries to determine their respective performance and how each one scales with increased data size. Our results will hopefully help with the selection of such libraries. The results show a significant difference in response times between all libraries for most data sizes, with only a few exceptions. Different exponential growth rates were also identified for all libraries, and performance often varied greatly depending on chart type. Response time is not the only variable in performance measurements; future research could cover other aspects, such as memory consumption and rendering requirements. There were also cases where the libraries did not render at larger data sizes despite showing no errors, and this behaviour warrants further investigation.
344

A performance investigation into JavaScript visualization libraries with the focus on render time and memory usage : A performance measurement of different libraries and statistical charts

Boström, Fredrik, Dahlberg, Alexander, Linderoth, William January 2022
Visualizing data is important to make it easier to understand. Due to the broad accessibility and popularity of the web, web-based visualization is growing. One common and easy way to visualize data is to use an existing JavaScript library. When developing such a website, important properties to keep in mind are render time and memory usage. This study measured the render time for different dataset sizes, the render time when rendering different numbers of charts with the same dataset size, and the memory usage for different dataset sizes. Six libraries were chosen for this study: D3, Echarts, CanvasJS, Chartist, Highcharts, and Plotly. An experiment was performed to test the libraries' render times and memory usage. The results show that D3 had the overall lowest render times, while CanvasJS had the lowest memory usage.
345

Developing the next generation of drones for water monitoring : Implementation of the User Interface (UI) of an internal website

Alsaudi, Omar, Tallozy, Yaman Mahmoud January 2022
This report is about implementing a graphical user interface for the CatFish website. The CatFish project is iterative research on water pollution in which samples from water bodies are monitored and collected using three different vehicles. The authors of this report, the front-end team, have created a website that presents data collected from the vehicles in the form of diagrams and charts. It also shows live video streaming and gives the CatFish team the ability to control the vehicles remotely. Our results show that the website is functional, user-friendly, and ready to be hosted and used.
346

Utveckling av en API-Hubb : Django REST Framework och React

Harnesk, Lukas January 2022
Today, employees at Bredband2 retrieve data from various databases through an internal system called an API-Hub. This API-Hub collects information such as customers, telephone numbers, services, and customers' personal data through API (Application Programming Interface) requests. The API-Hub is limited in its functionality and is currently owned by an external company that charges for connecting new services. This project is intended to create an API-Hub that expands this functionality by supplementing or replacing the existing one.

To do so, a Python-based web framework must be used that provides a good basis for meeting the employer's requirements and offers efficient, functional solution options. The functions available on the existing API-Hub will gain extended customization options through dynamic code that is scalable and works regardless of which function is to be performed. Functionality and development take place largely in Django REST Framework with the programming language Python. React is used to handle functions in the user interface. One of the tasks is to get React and Django REST Framework to interact, and PostgreSQL is used as the database.

The work started with preparatory research into which web framework to use. Django REST Framework was selected, and installation and basic configuration of Django followed. Once the basic configuration was in place, the development of API endpoints, models, views, and functions for the test environment began. Different behaviours were defined depending on whether a GET request or a POST request was sent to a given endpoint. The results show that it is entirely possible to develop an API-Hub with extended functionality and customization options using Django REST Framework and React, by creating dynamic code that serves several functions depending on which input data is handled.
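The "dynamic code" idea described above, one routing table that sends GET and POST requests to different functions per endpoint, can be sketched in a framework-agnostic way (the endpoint names and payloads here are illustrative, not Bredband2's real API, and the real implementation would sit behind Django REST Framework views):

```python
# Minimal sketch of endpoint dispatch driven by a handler table.

def get_customer(params):
    # In the real hub this would query PostgreSQL; here we return a stub.
    return {"customer_id": params["id"], "name": "example"}

def create_service(payload):
    return {"created": True, "service": payload["service"]}

# One table maps (method, path) pairs to handler functions, so adding an
# endpoint is a one-line change rather than a new branch of control flow.
HANDLERS = {
    ("GET", "/customers"): get_customer,
    ("POST", "/services"): create_service,
}

def dispatch(method, path, data):
    """Route a request to the matching handler, or report an error."""
    handler = HANDLERS.get((method, path))
    if handler is None:
        return {"error": f"no handler for {method} {path}"}
    return handler(data)
```

The same table-driven shape maps naturally onto a Django REST Framework router, where each entry becomes a view registered against a URL pattern.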
347

A visualization interface for spatial pathway regulation data

Zhang, Yang January 2018
Data visualization is an essential methodology for bioinformatics studies. Spatial Transcriptomics (ST) is a method that measures the transcriptome of tissue sections while maintaining their spatial information. The study of biological pathways, in turn, focuses on series of biochemical reactions that take place in organisms. As these studies generate a large number of datasets, this thesis attempts to combine ST data with pathway information and visualize it in an intuitive way to assist user comprehension and insight. In this thesis, Python was used for integrating the datasets and JavaScript libraries were used for building the visualization. The processing of ST pathway data, together with the data visualization interface, are the outcomes of this thesis. The visualization can show the regulation of pathways in the ST data and can be accessed in modern browsers. These outcomes can help users navigate ST and pathway datasets more effectively.
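The thesis does not publish its integration code; one common way to combine spot-level ST expression with pathway membership, averaging each pathway's member genes per spot, can be sketched as follows (the gene and pathway names are invented, and the scoring rule is an assumption rather than the thesis's actual method):

```python
def pathway_scores(expression, pathways):
    """
    expression: {spot_id: {gene: value}}, spot-level expression values
    pathways:   {pathway_name: [genes]}, pathway membership lists
    Returns {spot_id: {pathway_name: mean expression of member genes}},
    treating genes missing from a spot as zero expression.
    """
    scores = {}
    for spot, genes in expression.items():
        scores[spot] = {}
        for name, members in pathways.items():
            values = [genes.get(g, 0.0) for g in members]
            scores[spot][name] = sum(values) / len(values) if values else 0.0
    return scores
```

Per-spot scores in this shape can be serialized to JSON and handed directly to a browser-side JavaScript visualization, which matches the Python-for-integration, JavaScript-for-display split described in the abstract.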
348

TYPED VS UNTYPED PROGRAMMING LANGUAGES

Bni, Ouail, Matusiak, Artur Kamil January 2022
TypeScript (TS) has been growing in popularity since its release in October 2012. It is being adopted by many tech companies that specialize in web development. However, migrating old JS projects to TS can be challenging and time-consuming, which can prove problematic given the limited time at developers' disposal. The aim of this study is to find out what the benefits of using TS over JS are, and how they relate to software sustainability. By developing a migration tool that helps automate the migration process, we investigate whether such a tool would bring benefits to Axis Communications AB, especially their web developers who work with JS and TS. After the development of the artifact, a focus group consisting of six experienced web developers and one tester was invited to a workshop to evaluate it. The workshop consisted of three parts: five open questions, a demo presentation of the artifact, and an artifact evaluation based on the system dimensions goal, environment, structure, activity, and evolution, using a five-point Likert scale. Results from the workshop allowed us to better understand the needs and challenges faced by Axis developers during software maintenance. Furthermore, the results indicated that the artifact not only helps with code maintenance but also indirectly improves code greenability, which in itself lowers CO2 emissions. Migrating JS to TS improves some aspects of code maintenance and maintainability, and our artifact helps automate that migration process. With JavaScript, companies have to balance sustainability with greenability; by adding green maintenance practices and using TypeScript it is easier to keep that balance.
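The migration artifact itself is not reproduced in the abstract; a deliberately naive sketch of one migration step, copying a .js file to .ts and annotating untyped function parameters with `: any` so the TypeScript compiler accepts the file, might look like this (the regex only handles plain `function name(a, b)` declarations, and the real tool presumably does far more):

```python
import re
from pathlib import Path

# Matches 'function name(params)' declarations; arrow functions, methods,
# and default parameter values are deliberately out of scope here.
PARAM_RE = re.compile(r"function\s+(\w+)\s*\(([^)]*)\)")

def annotate_params(source: str) -> str:
    """Add ': any' to each parameter of plain function declarations."""
    def repl(match):
        name, params = match.group(1), match.group(2)
        if not params.strip():
            return match.group(0)
        typed = ", ".join(p.strip() + ": any" for p in params.split(","))
        return f"function {name}({typed})"
    return PARAM_RE.sub(repl, source)

def migrate_file(js_path: Path) -> Path:
    """Write an annotated .ts copy next to the original .js file."""
    ts_path = js_path.with_suffix(".ts")
    ts_path.write_text(annotate_params(js_path.read_text()))
    return ts_path
```

Starting from `: any` everywhere keeps the project compiling, after which developers can tighten the types incrementally, which is the usual staged JS-to-TS migration strategy.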
349

DependencyVis: Helping Developers Visualize Software Dependency Information

Lui, Nathan 01 June 2021
The use of dependencies has been increasing in popularity over the past decade, especially as package managers such as JavaScript's npm have made getting these packages a simple command to run. However, while incidents such as the left-pad incident have raised awareness of how risky relying on these packages is, there is still work to be done when it comes to getting developers to take the extra research step of determining whether a package is up to standard. Finding metrics for different packages and comparing them is a difficult and time-consuming task, especially since potential vulnerabilities are not the only metric to consider; how popular and how actively maintained a package is matters just as much. Therefore, we propose a visualization tool called DependencyVis, specific to JavaScript projects and npm packages, that analyzes a project's dependencies and helps developers by looking up basic metrics that capture a dependency's popularity, activeness, and vulnerabilities, such as the number of GitHub stars, forks, and issues, as well as security advisory information from npm audit. This thesis then proposes several use cases for DependencyVis to help users compare dependencies by displaying them in a graph with metrics represented by aspects such as node color or node size.
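The mapping DependencyVis describes, turning per-package metrics into visual attributes of graph nodes, can be sketched as below (the metric values, thresholds, and style scheme are invented for illustration, not taken from the tool):

```python
def node_style(metrics, max_stars):
    """
    Map one package's metrics to visual attributes:
    - size grows with popularity (GitHub stars, relative to the maximum)
    - color flags the vulnerability count: green, orange, or red
    """
    size = 10 + 40 * metrics["stars"] / max_stars if max_stars else 10
    vulns = metrics.get("vulnerabilities", 0)
    color = "green" if vulns == 0 else "orange" if vulns <= 2 else "red"
    return {"size": round(size, 1), "color": color}

def style_graph(packages):
    """Compute a style for every package in a dependency set."""
    max_stars = max((m["stars"] for m in packages.values()), default=0)
    return {name: node_style(m, max_stars) for name, m in packages.items()}
```

Encoding independent metrics on independent visual channels (size for popularity, color for risk) is what lets a user compare dependencies at a glance instead of reading metric tables package by package.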
350

Detection, Triage, and Attribution of PII Phishing Sites

Roellke, Dennis January 2022
Stolen personally identifiable information (PII) can be abused to commit a multitude of crimes in the victim's name. For instance, credit card information can be used in the drug trade, Social Security Numbers and health IDs can be used in insurance fraud, and passport data can be used for human trafficking or terrorism. Even information typically considered publicly available (e.g. name, birthday, phone number) can be used for unauthorized registration of services and the creation of new accounts under the victim's identity (unauthorized account creation). Accordingly, modern phishing campaigns have outgrown the goal of account takeover and are trending towards more sophisticated goals. While criminal investigations in the physical world have evolved over centuries, digital forensics is only a few decades old. In digital forensics, threat analysts have pioneered the field of enhanced attribution, a branch of threat intelligence that aims to find links between attacks and attackers. Their findings provide valuable information for investigators, ultimately bolstering takedown efforts and helping determine the proper course of legal action. Despite an overwhelming offering of security solutions today suggesting great threat-analysis capabilities, vendors only share attack signatures, and additional intelligence remains locked inside each vendor's ecosystem. Victims often hesitate to disclose attacks, fearing reputation damage and the accidental disclosure of intellectual property. This phenomenon limits the availability of post-mortem analyses of real-world attacks and often forces third-party investigators, such as government agencies, to mine their own data. In the absence of industry data, it can be promising to actively infiltrate fraudsters in an independent sting operation. Intuitively, undercover agents can monitor online markets for illegal offerings; another common industry practice is to trap attackers in monitored sandboxes called honeypots.
Using honeypots, investigators lure and deceive an attacker into believing an attack was successful while simultaneously studying the attacker's behavior. Insights gathered from this process allow investigators to examine the latest attack vectors, methodology, and overall trends. For either approach, investigators need additional information about the attacker so that they know what to look for. In the context of phishing attacks, it has repeatedly been proposed to "shoot tracers into the cloud" by stuffing phishing sites with fake information that can later be recognized in one way or another. However, to the best of our knowledge, no existing solution can keep up with modern phishing campaigns, because existing solutions focus on credential stuffing only, while modern campaigns steal more than just user credentials: they increasingly target PII instead. We observe that the use of HTML form input fields is a commonality among both credential-stealing and identity-stealing phishing sites, and we propose to thoroughly evaluate this feature for the detection, triage, and attribution of phishing attacks. This process includes extracting the phishing site's target PII from its HTML <label> tags, investigating how JavaScript code stylometry can be used to fingerprint a phishing site for its detection, and determining commonalities between threat actors' personal styles. Our evaluation shows that <input> tag identifiers and <label> tags are the most important features for this machine-learning classification task, lifting the accuracy from 68% without these features to up to 92% when including them. We show that <input> tag identifiers and code stylometry can also be used to decide whether a phishing site uses cloaking. Finally, we propose to build the first denial-of-phishing engine (DOPE) that handles all phishing, both credential stealing and PII theft. DOPE analyzes HTML <label> tags to learn which information to provide, and we craft this information in a believable manner, meaning that it can be expected to pass credibility tests by the phisher.
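The feature extraction step described above, collecting <input> identifiers and <label> text from a page's HTML, can be sketched with the standard-library parser (a simplified stand-in for the thesis's pipeline; the example markup is invented):

```python
from html.parser import HTMLParser

class FormFeatureExtractor(HTMLParser):
    """Collect <input> identifiers and <label> text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.input_ids = []
        self.labels = []
        self._in_label = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input":
            # Prefer the id attribute, fall back to name.
            ident = attrs.get("id") or attrs.get("name")
            if ident:
                self.input_ids.append(ident)
        elif tag == "label":
            self._in_label = True

    def handle_endtag(self, tag):
        if tag == "label":
            self._in_label = False

    def handle_data(self, data):
        if self._in_label and data.strip():
            self.labels.append(data.strip())

def extract_features(html):
    """Return the form-field features a classifier could be trained on."""
    parser = FormFeatureExtractor()
    parser.feed(html)
    return {"input_ids": parser.input_ids, "labels": parser.labels}
```

Label texts such as "Social Security Number" reveal which PII a page targets, while the input identifiers feed the classifier whose accuracy gains are reported in the abstract.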
