About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Främjande av inklusiva webbupplevelser : En jämförelsestudie av automatiserade tillgänglighetstestverktyg i en E2E-integrerad mjukvaruprocess / Promoting inclusive web experiences: a comparative study of automated accessibility testing tools in an E2E-integrated software process

Engström, Angelica January 2023
Creating inclusive web experiences means that users, regardless of their circumstances, should be able to perceive, understand, interact with, and contribute to services on the web. Established standards exist to detect and minimize accessibility limitations, yet most public authorities still provide inaccessible services, reportedly owing to a lack of competence and resources. The goal of the study is therefore to contribute knowledge about the effectiveness of automated accessibility testing tools. The tools are implemented within and for software processes and are integrated with an End-To-End testing tool. The goal is met through quantitative data collection inspired by Brajnik's definition of effectiveness, measuring the tools' execution time, completeness, specificity, and correctness. Based on measurements on four of the World Wide Web Consortium's demonstration web pages, evaluated against WCAG 2.1 according to DIGG's supervision manual, the study shows that the accessibility testing tools Pa11y, QualWeb, IBM Equal Access, and Google Lighthouse each perform better in some areas and worse in others. QualWeb has the shortest average execution time, at roughly 3,933 milliseconds, as well as the highest average completeness (80.94%) and specificity (58.26%). The tool with the highest correctness is Google Lighthouse (99.02%). None of the tools is considered perfect, since all of them make misjudgments. The study concludes that QualWeb is the more effective accessibility testing tool, but that it must be complemented with further testing methods such as manual and user-centered tests.
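The effectiveness notions measured here can be made concrete in a few lines of code. A minimal sketch, assuming each tool report has already been classified manually into true/false positives and negatives (the function and the counts are illustrative, not taken from the thesis):

```python
def effectiveness_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Brajnik-style effectiveness metrics from classified tool output.

    tp: real barriers the tool reported    fp: non-problems it reported
    fn: real barriers it missed            tn: non-problems it correctly ignored
    """
    return {
        "completeness": tp / (tp + fn) if tp + fn else 0.0,  # share of real barriers found
        "correctness":  tp / (tp + fp) if tp + fp else 0.0,  # share of reports that are real
        "specificity":  tn / (tn + fp) if tn + fp else 0.0,  # share of non-problems not flagged
    }

# Invented counts for one tool on one demo page:
print(effectiveness_metrics(tp=17, fp=4, fn=4, tn=30))
```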
2

BXE2E: a bidirectional transformation approach for medical record exchange

Ho, Jeremy 25 April 2017
Modern health care systems are information dense and rely increasingly on computer-based information systems. Regrettably, many of these systems behave only as information repositories, and interoperability between different systems remains a challenge even after decades of investment in health information exchange standards. Medical records are complex data models, and developing medical data import/export functions is a difficult, error-prone, and hard-to-maintain process. Bidirectional transformation (bx) theories have been developed within the last decade in the fields of software engineering, programming languages, and databases as a mechanism for relating different data models and keeping them consistent with each other. Current bx theories and tools have been applied to hand-picked, small-scale problems outside of the health care sector. We believe, however, that medical record exchange is a promising industrial application for bx theories and may resolve some of the interoperability challenges in this domain. We introduce BXE2E, a proof-of-concept framework that frames the medical record interoperability challenge as a bx problem and provides a real-world application of bx theories. In our experiments, BXE2E reliably imported and exported medical records correctly and with reasonable performance. By applying bx theories to the medical document exchange problem, we demonstrate a way to reduce the difficulty of creating and maintaining such a system as well as the number of errors that may result. BXE2E's fundamental design allows it to be integrated easily with other data systems that could benefit from bx theories.
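The core bx idea can be shown with a minimal "lens": a forward transformation paired with a backward one, kept consistent by two round-trip laws. A sketch in Python under invented record types (the fields are illustrative, not BXE2E's actual models):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class EmrRecord:                     # hypothetical internal model
    patient_id: str
    allergies: list

@dataclass(frozen=True)
class ExportDoc:                     # hypothetical exchange document
    pid: str
    allergy_csv: str

def get(src: EmrRecord) -> ExportDoc:
    """Forward transformation: internal record -> exchange document."""
    return ExportDoc(pid=src.patient_id, allergy_csv=",".join(src.allergies))

def put(src: EmrRecord, view: ExportDoc) -> EmrRecord:
    """Backward transformation: fold an (edited) document back into the record."""
    allergies = view.allergy_csv.split(",") if view.allergy_csv else []
    return replace(src, patient_id=view.pid, allergies=allergies)

# The well-behavedness laws that keep both models consistent:
r = EmrRecord("p1", ["latex", "penicillin"])
assert put(r, get(r)) == r                    # GetPut: no edit, no change
v = ExportDoc("p1", "latex")
assert get(put(r, v)) == v                    # PutGet: edits are not lost
```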
3

Proposta de um modelo para verificabilidade E2E no sistema eletrônico de votação brasileiro utilizando mecanismos de criptografia visual / A proposed model for E2E verifiability in the Brazilian electronic voting system using visual cryptography mechanisms

Varejão Junior, Gleudson Pinheiro 21 August 2014
History records that the rise of democracy was often restricted to small groups within a population, as frequently occurred in some nations. Brazil is a democratic country in which society participates, exercising its democratic rights through elected representatives. However, evidence described by Diego Aranha (ARANHA, 2014) indicates that the way those representatives are elected does not always reach acceptable levels of security and reliability. Electronic Voting Systems (SEV) have been employed in countries such as the Netherlands, India, Germany, and Brazil, with the main objective of meeting the requirements, properties, rules, and laws established for an electoral system, while respecting the democratic standards and precepts specific to each nation. In Brazil, the computerization of elections began in 1996, when a 100% electronic voting model was presented to the world; according to the authorities responsible for the process, it is claimed to be secure and immune to fraud. Since then, many discussions about its security have arisen. One of the issues most debated among professionals and researchers in related fields is the impossibility of end-to-end (E2E) verifiability, which aims to provide mechanisms that let voters verify their own vote; this is largely a consequence of the absence of any mechanism for materializing the vote. In view of these reports, the need to combine transparency with automation becomes clear, mitigating the risk of fraud and maximizing the possibilities for auditing and recounting votes. Computational cryptography has thus proven to be one of the main tools for meeting the security demands of an SEV. This work studies and evaluates the principles of an SEV, along with its main technologies and security challenges. Based on this study, a model using visual cryptography is proposed in order to provide mechanisms that satisfy the E2E verifiability requirement by materializing the vote in a non-traditional way, with a focus on applying the scheme to the Brazilian electronic voting system.
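For readers unfamiliar with visual cryptography, the classic Naor–Shamir 2-out-of-2 scheme shows the mechanism such proposals build on: each secret pixel is split into two random shares that reveal nothing on their own and reconstruct the pixel when overlaid. A minimal sketch (the thesis's own scheme may differ):

```python
import random

# Each secret pixel becomes a pair of subpixels per share (1 = black).
# Identical pairs overlay to half black (a "white" pixel); complementary
# pairs overlay to all black (a "black" pixel).
PATTERNS = [(0, 1), (1, 0)]

def split_pixel(bit: int):
    """bit=0 white, bit=1 black; returns the two shares' subpixel pairs."""
    p = random.choice(PATTERNS)
    q = p if bit == 0 else tuple(1 - s for s in p)
    return p, q

def overlay(a, b):
    """Stacking transparencies acts as a bitwise OR of black subpixels."""
    return tuple(x | y for x, y in zip(a, b))

secret_row = [1, 0, 1, 1]
shares = [split_pixel(b) for b in secret_row]
print([overlay(s1, s2) for s1, s2 in shares])  # (1,1) for black, mixed for white
```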
4

Simulation fine d'optique adaptative à très grand champ pour des grands et futurs très grands télescopes / Fine simulation of very-wide-field adaptive optics for large and future extremely large telescopes

Chebbo, Manal 24 September 2012
Refined simulation tools for wide-field AO systems on ELTs present new challenges. The increased number of degrees of freedom makes standard simulation codes unusable because of the huge number of operations to be performed at each step of the AO loop: the classical matrix inversion and vector-matrix multiplications must be replaced by a cleverer iterative resolution of the Least Squares or Minimum Mean Square Error criterion. The concepts themselves also become more complex in this new generation of AO systems: data fusion from multiple LGS and NGS must be optimized; deformable mirrors covering the whole field of view must be coupled with dedicated mirrors inside the scientific instrument itself, using split or integrated tomography schemes; and differential pupil and/or field rotations must be considered. All these new features should be carefully simulated, analysed, and quantified in terms of performance before any implementation in real AO systems. For these reasons I developed, in collaboration with ONERA, a full simulation code based on the iterative solution of linear systems with many parameters (sparse matrices). On this basis, I introduced new concepts of filtering and data fusion to manage modes such as tip, tilt, and defocus throughout the tomographic reconstruction process. The code will also eventually help develop and test complex control laws that have to manage a combination of an adaptive telescope and post-focal instruments that include their own dedicated deformable mirrors. The first application of this tool naturally falls within the EAGLE project, one of the flagship instruments of the future E-ELT, which from the AO point of view combines all of these challenges.
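The abstract's key computational move, replacing explicit inversion with an iterative solve over sparse matrices, looks roughly as follows; a sketch with SciPy's conjugate gradient on a random sparse SPD system standing in for a real tomographic reconstruction matrix:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 2000                                    # stand-in for many-actuator AO systems
A = sp.random(n, n, density=1e-3, random_state=0)
A = A @ A.T + sp.identity(n)                # symmetric positive definite, still sparse
b = rng.standard_normal(n)                  # stand-in for wavefront measurements

x, info = cg(A, b)                          # iterative solve; no dense inversion
print(info == 0, np.linalg.norm(A @ x - b)) # info == 0 means converged
```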
5

Sledovaní současného stavu testovacích technik ve vybrané společnosti / Testing Techniques in Continuous Integration System

Shpak, Yuliia January 2020
With the development of information and communication technologies, modern industrial control systems (ICS) increasingly face the question of automated testing to ensure system stability and security. Testing has therefore become one of the most important parts of the life cycle of any software. In this thesis I consider how existing testing methods and tools can be used to achieve sufficient software quality and security in continuous integration systems.
6

Mécanismes auto-organisants pour connexions de bout en bout / Self-organizing mechanisms for end-to-end connections

Floquet, Julien 19 December 2018
Fifth-generation networks are being defined and their components are beginning to emerge: new radio access technologies, fixed-mobile network convergence, and virtualization. End-to-end (E2E) control and management of the network are particularly important for network performance. With this in mind, we split the thesis into two parts: the radio access network (RAN), with a focus on Massive MIMO (M-MIMO) technology, and the E2E connection from the point of view of the transport layer. In the first part, we consider hierarchical beamforming in wireless networks. For a given population of flows, we propose computationally efficient algorithms for alpha-fair rate allocation. We then propose closed-form formulas for flow-level performance, for both elastic traffic (under either proportional fairness or max-min fairness) and streaming traffic, and validate the analytical results with simulations. In the second part, we develop a self-organizing network (SON) function that improves the quality of experience (QoE) of E2E connections for a video streaming service. A playout buffer receives data from a server over an E2E connection using the TCP protocol. We propose a model that describes the buffer's behavior and compare the resulting analytical formulas with simulations. Finally, we propose a SON function that adjusts the application video rate so that the starvation probability equals a preset target value.
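The proposed SON function can be read as a feedback loop: estimate the starvation probability at the current video coding rate and lower the rate until a target is met. A toy Monte-Carlo sketch of that idea (the fluid buffer model and every number are illustrative, not the thesis's analytical formulas):

```python
import random

def starvation_prob(video_rate, tcp_mean, tcp_std,
                    prebuffer=5.0, n_runs=500, horizon=300):
    """Fraction of simulated playbacks whose playout buffer empties."""
    starved = 0
    for _ in range(n_runs):
        buf = prebuffer                              # seconds of buffered video
        for _ in range(horizon):                     # one step = one second
            throughput = max(0.0, random.gauss(tcp_mean, tcp_std))
            buf += throughput / video_rate - 1.0     # download minus playout
            if buf <= 0.0:
                starved += 1
                break
    return starved / n_runs

# SON loop: back the rate off until the starvation target is met.
rate, target = 4.0, 0.01                             # Mb/s, target probability
while rate > 0.5 and starvation_prob(rate, tcp_mean=3.0, tcp_std=1.5) > target:
    rate -= 0.25
print(f"selected video rate: {rate:.2f} Mb/s")
```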
7

End-to-End Delay Performance Evaluation for VoIP in the LTE network

Masum, Md. Ebna, Babu, Md. Jewel January 2011
Long Term Evolution (LTE) is the last step towards the 4th generation of cellular networks, a revolution necessitated by the unceasing increase in demand for high-speed connections. This thesis focuses on the performance evaluation of end-to-end (E2E) delay for VoIP in LTE networks. The evaluation follows a simulation approach using OPNET 16.0. Three scenarios were created: a baseline network, a network carrying VoIP traffic only, and a network carrying FTP alongside VoIP. E2E delay was measured for both traffic scenarios in various cases under varying node mobility speeds, and packet loss for the two network scenarios was studied and presented for the same cases. A comparative performance analysis of the two networks was carried out using the simulation output graphs. Based on this analysis, the performance quality of a VoIP network in LTE, with and without the presence of additional network traffic, is determined and discussed. OPNET 16.0's default LTE parameters were used throughout the simulations.
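As background, one-way VoIP delay is commonly judged against the ITU-T G.114 guideline of at most 150 ms, summed over per-component budgets. A small illustrative budget check (the component values are invented, not the thesis's OPNET results):

```python
# Hypothetical one-way delay budget for VoIP over LTE, in milliseconds.
components_ms = {
    "codec + packetization": 30.0,   # e.g. a 20 ms frame plus lookahead
    "LTE access (uplink)":   25.0,
    "core network transit":  20.0,
    "jitter buffer":         40.0,
    "decoding + playout":     5.0,
}

total = sum(components_ms.values())
verdict = "within" if total <= 150 else "exceeds"
print(f"one-way E2E delay: {total:.0f} ms ({verdict} the 150 ms G.114 target)")
```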
8

Nízko-dimenzionální faktorizace pro "End-To-End" řečové systémy / Low-Dimensional Matrix Factorization in End-To-End Speech Recognition Systems

Gajdár, Matúš January 2020
The project covers automatic speech recognition with neural network training using low-dimensional matrix factorization. We describe time delay neural networks with factorization (TDNN-F) and without it (TDNN), implemented in PyTorch. We compare the PyTorch implementation with the Kaldi toolkit and achieve similar results in experiments with various network architectures. The last chapter describes the impact of low-dimensional matrix factorization on end-to-end speech recognition systems, as well as a modification of such a system with TDNN(-F) networks. With specific network settings, we were able to achieve better results with the factorized systems while also reducing training complexity by decreasing the number of network parameters.
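A TDNN-F layer replaces one wide temporal convolution with two thinner factors through a low-rank bottleneck, keeping the first factor approximately semi-orthogonal. A hedged PyTorch sketch of that structure (dimensions and the simplified constraint update are illustrative, not the thesis's exact code):

```python
import torch
import torch.nn as nn

class TDNNFLayer(nn.Module):
    """One factorized TDNN layer: wide Conv1d -> bottleneck -> wide again."""

    def __init__(self, in_dim=512, bottleneck=64, out_dim=512, context=3):
        super().__init__()
        # factor 1: temporal projection down to the low-rank bottleneck
        self.factor1 = nn.Conv1d(in_dim, bottleneck, kernel_size=context, bias=False)
        # factor 2: 1x1 projection back up, then the usual nonlinearity
        self.factor2 = nn.Conv1d(bottleneck, out_dim, kernel_size=1)
        self.act = nn.ReLU()

    @torch.no_grad()
    def semi_orthogonal_step(self, nu=0.125):
        """Simplified periodic update nudging factor1 toward M M^T = I."""
        w = self.factor1.weight.flatten(1).clone()    # (bottleneck, in_dim*context)
        p = w @ w.t()
        w = w - nu * (p - torch.eye(w.size(0))) @ w
        self.factor1.weight.copy_(w.view_as(self.factor1.weight))

    def forward(self, x):                             # x: (batch, in_dim, time)
        return self.act(self.factor2(self.factor1(x)))

layer = TDNNFLayer()
out = layer(torch.randn(8, 512, 100))
layer.semi_orthogonal_step()
print(out.shape)                                      # torch.Size([8, 512, 98])
```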
9

Key management with a trusted third party using LoRaWAN protocol : A study case for E2E security

Ralambotiana, Miora January 2018
Nowadays, Internet of Things (IoT) applications are gaining importance in people's everyday lives. Several standards exist, depending on the usage (long- or short-distance communication, low- or high-power devices, and so on). This study focuses on Low Power Wide Area Networks (LPWAN) and in particular on a protocol rising in popularity for long-range, low-power IoT communications: LoRaWAN. LoRaWAN is still at an early stage and has mainly been used in cases where the network server manages the keys that ensure the confidentiality and integrity of the data. Gemalto has raised the issue of conflicts of interest when the network operator and the application provider are two distinct entities: if the end-device and the application server exchange sensitive data, the network server should not be able to read them. To solve this problem, an architecture in which a trusted third party generates and manages the keys was implemented during this project. The research aims at finding threats to, and weaknesses in, the confidentiality and integrity of the data and the authentication of devices in this case study. The LoRaWAN protocol and key management in general are studied first; the studied system is then described, and possible attacks exploiting its vulnerabilities are identified via an attack tree. These attacks were simulated to determine their consequences for the system, and security improvements to the architecture were proposed accordingly, based on previous work on the topic and an exploration of potential countermeasures.
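For context, in LoRaWAN 1.0.x the two session keys are derived from the root AppKey by single AES-128 encryptions of a nonce block; in the studied architecture, this step would sit with the trusted third party rather than the network server, so the latter never learns the AppSKey. A sketch with PyCryptodome (the spec's byte-ordering details are glossed over; all values are illustrative):

```python
from Crypto.Cipher import AES   # pip install pycryptodome

def derive_session_keys(app_key: bytes, app_nonce: bytes,
                        net_id: bytes, dev_nonce: bytes):
    """LoRaWAN 1.0.x-style derivation after a join procedure:
    SKey = AES128_enc(AppKey, prefix | AppNonce | NetID | DevNonce | pad16)."""
    assert len(app_key) == 16 and len(app_nonce) == 3
    assert len(net_id) == 3 and len(dev_nonce) == 2
    cipher = AES.new(app_key, AES.MODE_ECB)
    block = lambda prefix: prefix + app_nonce + net_id + dev_nonce + b"\x00" * 7
    nwk_s_key = cipher.encrypt(block(b"\x01"))   # network session key
    app_s_key = cipher.encrypt(block(b"\x02"))   # application session key
    return nwk_s_key, app_s_key

nwk, app = derive_session_keys(b"\x2b" * 16, b"\x01\x02\x03",
                               b"\x00\x00\x13", b"\xab\xcd")
print(nwk.hex(), app.hex())
```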
10

Testing Safety-Critical Systems using Fault Injection and Property-Based Testing

Vedder, Benjamin January 2015
Testing software-intensive systems can be challenging, especially when safety requirements are involved. Property-Based Testing (PBT) is a software testing technique where properties about software are specified, and thousands of test cases with a wide range of inputs are automatically generated based on these properties. PBT does not formally prove that the software fulfils its specification, but it is an efficient way to identify deviations from it. Safety-critical systems that must be able to deal with faults without causing damage or injuries are often tested using Fault Injection (FI) at several abstraction levels. The purpose of FI is to inject faults into a system in order to exercise and evaluate its fault handling mechanisms. The aim of this thesis is to investigate how knowledge and techniques from the areas of FI and PBT can be used together to test functional and safety requirements simultaneously. We have developed an FI tool named FaultCheck that enables PBT tools to use common FI techniques directly on source code. To evaluate and demonstrate our approach, we applied FaultCheck together with the commercially available PBT tool QuickCheck to a simple and to a complex system: the AUTOSAR End-to-End (E2E) library and a quadcopter simulator that we developed ourselves. The quadcopter simulator is based on a hardware quadcopter platform that we also developed, and the fault models we inject into the simulator using FaultCheck are derived from that platform. We were able to apply FaultCheck together with QuickCheck efficiently to both the E2E library and the quadcopter simulator, which gives us confidence that FI together with PBT can be used to test and evaluate a wide range of simple and complex safety-critical software. / This research has been funded through the PROWESS EU project (grant agreement no. 317820), the KARYON EU project (grant agreement no. 288195), and through EISIGS (grants from the Knowledge Foundation).
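The FI-plus-PBT combination can be shown in miniature: generate random payloads, inject a single bit-flip fault, and assert that the end-to-end protection detects it. A sketch with Python's hypothesis library and a CRC32 stand-in for an AUTOSAR E2E profile (not the actual FaultCheck/QuickCheck setup):

```python
import zlib
from hypothesis import given, strategies as st

def protect(payload: bytes) -> bytes:
    """Toy E2E protection: append a CRC32 over the payload."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check(frame: bytes) -> bool:
    payload, crc = frame[:-4], frame[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc

@given(payload=st.binary(min_size=1, max_size=64),
       bit=st.integers(min_value=0, max_value=7),
       data=st.data())
def test_single_bitflip_is_detected(payload, bit, data):
    frame = bytearray(protect(payload))
    i = data.draw(st.integers(min_value=0, max_value=len(frame) - 1))
    frame[i] ^= 1 << bit                    # injected fault: flip one bit anywhere
    assert not check(bytes(frame))          # protection must reject the corrupted frame

test_single_bitflip_is_detected()           # hypothesis generates the test cases
```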
