111 |
Detekce časových postranních kanálů v TLS / Detection of Timing Side-Channels in TLS. Koscielniak, Jan. January 2020.
The TLS protocol is complex and widely deployed. Many devices use TLS to establish secure communication, so the protocol needs to be tested thoroughly. This master's thesis focuses on timing side-channel attacks, which keep reappearing as variations on already known attacks. The thesis aims to make it easier to remove these side channels correctly and to prevent new ones from being introduced, by building an automated framework that is integrated into the tlsfuzzer tool and by creating test scenarios for known side-channel attacks. The resulting extension uses tcpdump to collect timing data and applies statistical tests, together with supporting plots, to decide whether a possible side channel is present. The extension was evaluated with the new test scripts and showed a good ability to distinguish a side channel. The extension and the tests are now part of tlsfuzzer.
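To illustrate the decision step described above, the following Python sketch applies a nonparametric test to two sets of simulated response times, much as the extension does after collecting timings with tcpdump. This is a minimal sketch only: the simulated samples, the significance threshold, and the printed verdict are illustrative assumptions and are not taken from tlsfuzzer's actual implementation.

```python
# Sketch of the statistical decision step: given paired response-time samples
# for two probe classes, a nonparametric test flags a possible timing side
# channel when the distributions differ significantly.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
# Simulated response times in seconds for two classes of crafted records;
# class_b carries a small artificial delay to emulate a leaking implementation.
class_a = rng.normal(1.00e-3, 5e-5, size=1000)
class_b = rng.normal(1.02e-3, 5e-5, size=1000)

ALPHA = 1e-5  # conservative significance threshold (assumed, not from tlsfuzzer)

stat, p_value = wilcoxon(class_a, class_b)  # paired, nonparametric test
print(f"Wilcoxon p-value: {p_value:.3e}")
if p_value < ALPHA:
    print("Possible timing side channel: distributions differ significantly.")
else:
    print("No evidence of a timing side channel at this sample size.")
```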
|
112 |
Zabezpečený peer to peer komunikační systém / Secure peer-to-peer communication system. Eliáš, Luboš. January 2008.
The main aim of this master's thesis is to implement a general, secure peer-to-peer communication system. The system can automatically establish and run a secure end-to-end connection, even when a network address translator lies on the path to the destination system, without any explicit configuration of that translator. The security procedures are transparently hidden from individual applications, which would otherwise have to solve this problem each in their own way. Responsibility for security is delegated to an application-independent subsystem working in the core of the operating system. The security of this subsystem is based on capturing outbound and inbound IP packets and authenticating and encrypting them. The system was successfully implemented in the MS Windows XP operating system in C++. The transfer rate of the communication tunnel was measured at various network bandwidths. The results show that on a standard present-day PC there is practically no decrease in transfer rate compared with an ordinary, unprotected channel.
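The per-packet protection described above (capturing IP packets and authenticating and encrypting them) can be illustrated with a short sketch. The thesis implements this in C++ inside the Windows XP kernel; the Python sketch below only shows the authenticated-encryption idea using AES-GCM, and the key handling, nonce layout, and header bytes are simplified assumptions.

```python
# Sketch of the per-packet protection idea: each captured payload is encrypted
# and authenticated before being forwarded, then verified and decrypted on
# receipt. Key distribution and the real IP-layer integration are omitted.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # shared session key (assumed pre-established)
aead = AESGCM(key)

def protect(payload: bytes, header: bytes) -> bytes:
    """Encrypt the payload and authenticate it together with immutable header fields."""
    nonce = os.urandom(12)                       # unique per packet
    return nonce + aead.encrypt(nonce, payload, header)

def unprotect(packet: bytes, header: bytes) -> bytes:
    """Verify and decrypt; raises an exception if the packet was tampered with."""
    nonce, ciphertext = packet[:12], packet[12:]
    return aead.decrypt(nonce, ciphertext, header)

header = b"example-ip-header-fields"             # placeholder for authenticated header data
secured = protect(b"application data", header)
assert unprotect(secured, header) == b"application data"
```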
|
113 |
Metody zajištění IP PBX proti útokům / Securing IP PBX against attacks. Hynek, Luboš. January 2013.
This master's project focuses on options for protecting the most common free-software PBXes: Asterisk, FreeSWITCH, and YATE. The behaviour of the PBXes under attack was verified in practice, and protection against the attacks was proposed on CentOS, one of the most popular Linux server distributions. A tool was created to simulate several types of denial-of-service attacks. The work uses both the protective options of the PBXes themselves and the capabilities of the operating system, and it also compares the protection options of the individual PBXes with one another. A brief description of the protocol, the attack topologies, and recommendations for operating softswitches are included.
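As an illustration of the rate-limiting principle behind the operating-system protections (such as firewall rules or log-based banning), the sketch below counts requests per source address in a sliding window and flags sources that exceed a threshold. The window size, the threshold, and the event format are illustrative assumptions, not values or code from the thesis.

```python
# Illustration of the rate-limiting idea used by OS-level protections: flag any
# source IP that sends more than MAX_REQUESTS SIP requests within WINDOW seconds.
from collections import defaultdict, deque

WINDOW = 10.0       # seconds (assumed)
MAX_REQUESTS = 50   # requests allowed per source within the window (assumed)

recent = defaultdict(deque)   # src_ip -> timestamps of recent requests
blocked = set()

def observe(timestamp: float, src_ip: str) -> None:
    """Record one SIP request and flag the source if it floods."""
    q = recent[src_ip]
    q.append(timestamp)
    while q and timestamp - q[0] > WINDOW:   # drop events outside the window
        q.popleft()
    if len(q) > MAX_REQUESTS and src_ip not in blocked:
        blocked.add(src_ip)
        print(f"blocking {src_ip}: {len(q)} requests in {WINDOW:.0f}s")

# Example: a flood of requests from one address
for i in range(60):
    observe(i * 0.05, "203.0.113.7")
```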
|
114 |
Alternative Approaches for the Registration of Terrestrial Laser Scanners Data using Linear/Planar Features. Dewen Shi (9731966). 15 December 2020.
Static terrestrial laser scanners have been used increasingly for three-dimensional data acquisition because they rapidly provide accurate, high-resolution measurements. Several scans from multiple viewpoints are necessary to achieve complete coverage of the surveyed objects due to occlusion and large object size. Therefore, in order to reconstruct three-dimensional models of the objects, registration is required to transform the individual scans into a common reference frame. This thesis introduces three alternative approaches for the coarse registration of two adjacent scans: a feature-based approach, a pseudo-conjugate point-based method, and a closed-form solution. In the feature-based approach, linear and planar features in the overlapping area of adjacent scans are selected as registration primitives. The pseudo-conjugate point-based method utilizes non-corresponding points along common linear and planar features to estimate the transformation parameters; it is simpler than the feature-based approach because the partial derivatives are easier to compute. In the closed-form solution, the rotation matrix is first estimated using a unit quaternion, which is a concise description of the rotation; the translation parameters are then estimated from non-corresponding points along the linear or planar features using the pseudo-conjugate point-based method. Alternative approaches for fitting a line or plane to data with errors in three-dimensional space are also investigated.

Experiments are conducted using simulated and real datasets to verify the effectiveness of the introduced registration procedures and feature-fitting approaches. The two proposed line-fitting approaches are tested with simulated datasets; the results suggest that they produce identical line parameters and variance-covariance matrices. The three registration approaches are tested with both simulated and real datasets. On the simulated datasets, all three registration approaches produce equivalent transformation parameters using linear or planar features, and the comparison between simulated linear and planar features shows that both feature types yield equivalent registration results. On the real datasets, the three registration approaches using linear or planar features also produce equivalent results; in addition, the approaches using planar features produce better results than those using linear features. The experiments show that the pseudo-conjugate point-based approach is easier to implement than the feature-based approach. The pseudo-conjugate point-based method and the feature-based approach are nonlinear, so both require an initial guess of the transformation parameters. In contrast, the closed-form solution is linear and can therefore register two adjacent scans without any initial guess. The pseudo-conjugate point-based method and the closed-form solution are thus the preferred approaches for coarse registration using linear or planar features. In practice, planar features are preferable to linear features, since linear features are derived indirectly by intersecting neighbouring planar features; to obtain enough lines with different orientations, planes that are far apart from each other have to be extrapolated to derive lines.
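As an illustration of the closed-form rotation step mentioned above, the sketch below recovers a rotation matrix from corresponding point sets via the unit quaternion given by the dominant eigenvector of Horn's 4x4 matrix. The synthetic data and function names are illustrative; the thesis applies the quaternion idea together with non-corresponding points along linear or planar features, which this simplified example does not reproduce.

```python
import numpy as np

def rotation_from_quaternion_fit(p: np.ndarray, q: np.ndarray) -> np.ndarray:
    """Closed-form estimate of the rotation R with q_i ~ R @ p_i (Horn's quaternion method)."""
    p_c = p - p.mean(axis=0)                      # remove translation by centering
    q_c = q - q.mean(axis=0)
    S = p_c.T @ q_c                               # 3x3 cross-covariance matrix
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    N = np.array([                                # symmetric 4x4 matrix of Horn (1987)
        [Sxx + Syy + Szz, Syz - Szy,        Szx - Sxz,        Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,  Sxy + Syx,        Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,       -Sxx + Syy - Szz,  Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,        Syz + Szy,       -Sxx - Syy + Szz],
    ])
    eigvals, eigvecs = np.linalg.eigh(N)
    w, x, y, z = eigvecs[:, np.argmax(eigvals)]   # unit quaternion = dominant eigenvector
    return np.array([                             # rotation matrix from the quaternion
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])

# Synthetic check: recover a known 30-degree rotation about the z-axis.
rng = np.random.default_rng(1)
p = rng.normal(size=(20, 3))
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
q = p @ R_true.T
print(np.allclose(rotation_from_quaternion_fit(p, q), R_true))   # expect True
```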
|
115 |
Web site security maturity of the European Union and its member states : A survey study on the compliance with best practices of DNSSEC, HSTS, HTTPS, TLS-version, and certificate validation types. Rapp, Axel. January 2021.
With e-governance steadily growing, citizen-to-state communication via Web sites is growing as well, placing enormous trust in the protocols designed to handle this communication securely. Since breaching any of the protocols enabling Web site communication could benefit a malicious attacker and harm end-users, the battle between hackers and information security professionals is ongoing and never-ending. This is the main reason why it is important to adhere to the latest best practices established by specialized independent organizations. Best-practice compliance matters for any organization, but perhaps most of all for our governing authorities, which we should hold to the highest possible standard because of their societal responsibility to protect the public. This report conducts a quantitative survey of the Web sites of the governments and government agencies of the member states of the European Union, as well as Web sites controlled by the European Union itself, to assess to what degree their domains comply with the current best practices for DNSSEC, HSTS, HTTPS, SSL/TLS, and certificate validation types. The findings show significant differences in compliance level between the parameters measured: HTTPS best-practice deployment was the highest (96%) and HSTS best-practice deployment the lowest (3%). Further, when comparing average best-practice compliance by country, Denmark and the Netherlands performed best, while Cyprus had the lowest average.
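As an illustration of the kind of per-domain probe such a survey relies on, the sketch below connects to a host over HTTPS, reads the negotiated TLS version, and checks for an HSTS header. The host name, timeout, and reported fields are illustrative assumptions; the actual survey also covers DNSSEC and certificate validation types, which this sketch does not.

```python
import ssl
import http.client

def check_domain(host: str, timeout: float = 10.0) -> dict:
    """Probe one domain for HTTPS reachability, negotiated TLS version, and HSTS header."""
    context = ssl.create_default_context()        # validates the certificate chain
    conn = http.client.HTTPSConnection(host, 443, timeout=timeout, context=context)
    try:
        conn.request("GET", "/")
        response = conn.getresponse()
        hsts = response.getheader("Strict-Transport-Security")
        tls_version = conn.sock.version()         # e.g. 'TLSv1.3'
        return {
            "https": True,
            "tls_version": tls_version,
            "hsts_present": hsts is not None,
            "hsts_value": hsts,
        }
    finally:
        conn.close()

if __name__ == "__main__":
    print(check_domain("example.org"))   # placeholder domain, not from the survey
```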
|
116 |
Relay Racing with X.509 Mayflies : An Analysis of Certificate Replacements and Validity Periods in HTTPS Certificate Logs / Stafettlöpning med X.509-dagsländor : En Analys av Certifikatutbyten och Giltighetsperioder i HTTPS-certifikatloggar. Bruhner, Carl Magnus, Linnarsson, Oscar. January 2020.
Certificates are the foundation of secure communication over the internet today. While certificates can be issued with long validity periods, there is always a risk of them being compromised during their lifetime. A good practice is therefore to use shorter validity periods; however, this limits the certificate lifetime and gives less flexibility in the timing of certificate replacements. In this thesis, we use publicly available network logs from Rapid7's Project Sonar to provide an overview of the current state of certificate usage behavior. Specifically, we look at the Let's Encrypt mass revocation event in March 2020, where millions of certificates were revoked with just five days' notice. In general, we show how this kind of dataset can be used, and as a deeper exploration we analyze certificate validity, lifetime, and the use of certificates with overlapping validity periods, and discuss how our findings relate to industry standards and current security trends. In particular, we isolate automated certificate services such as Let's Encrypt and cPanel to see how their certificates differ in characteristics from certificates in general. Based on our findings, we propose a set of rules to help improve trust in certificate usage and strengthen security online, introducing an Always secure policy that aligns certificate validity with revocation time limits in order to replace revocation requirements and to address the fact that mobile devices today ignore this very important security feature. To round things off, we provide some ideas for further research based on our findings and on what is possible with datasets such as the one studied in this thesis.
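To make the validity-period analysis concrete, the sketch below parses a single PEM certificate with the cryptography package and reports its lifetime; the 90-day cut-off used to label a certificate as short-lived, automated-style issuance is an assumption for illustration, not a rule from the thesis or from the Project Sonar dataset.

```python
from datetime import timedelta
from cryptography import x509

def certificate_lifetime(pem_data: bytes) -> dict:
    """Extract issuer, validity window, and lifetime from one PEM certificate."""
    cert = x509.load_pem_x509_certificate(pem_data)
    lifetime = cert.not_valid_after - cert.not_valid_before
    return {
        "issuer": cert.issuer.rfc4514_string(),
        "not_before": cert.not_valid_before,
        "not_after": cert.not_valid_after,
        "lifetime_days": lifetime / timedelta(days=1),
        # 90 days is an assumed cut-off for "short-lived, automated" issuance
        "short_lived": lifetime <= timedelta(days=90),
    }

if __name__ == "__main__":
    with open("cert.pem", "rb") as handle:        # path is a placeholder
        print(certificate_lifetime(handle.read()))
```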
|
117 |
Sécurité pour les réseaux du futur : gestion sécurisée des identités / Security for future networks : secure identity management. Aissaoui Mehrez, Hassane. 10 July 2015.
Today, the Internet is radically changing our habits, with the massive arrival of nomadic computing, the Internet of Things, the growing use of grid computing, Web services, social networks, and the emergence of new approaches in recent years. The virtualization of computing infrastructures and Cloud Computing in particular have made it possible to define new paradigms, called X as a Service (XaaS), introducing a fairly clear break with the traditional models; they can be seen as a preparatory step towards the Internet of the Future. Indeed, implementing these paradigms makes it possible to pool and reorganize the computing system in a different way, to dematerialize physical infrastructures, and to move systems or applications onto remote virtual containers. Consequently, the global architecture of the Internet must evolve, relying heavily on these new approaches, in particular Cloud Computing and virtualization. Unfortunately, like any new technology, they create new risks that come on top of the traditional problems: privilege separation, access management, identity management, flaws in virtualization software, isolation of virtual machines (VMs), protection of personal data, privacy, reversibility during outsourcing, and so on. Cloud-based services require secure cross-functional collaboration as well as systems that protect against the misuse of resources. These systems must be reasonably balanced against the needs for confidentiality, integrity, and protection of users' privacy, and they must allow users to be authenticated without revealing information about their identity. Thus, offering personalized services to customers in a virtual and/or cross-organizational environment, using security mechanisms designed for traditional infrastructures, can become very complex in the Cloud Computing model and poses challenges for the providers of these services. Among these challenges is the identity management of resources, a crucial element for authenticating the services to be consumed and for minimizing the risk of fraudulent access to personal data, which can have disastrous consequences for a company or a customer. Existing solutions are insufficient to meet the challenges raised by these new approaches. Implementing these models and tools raises security challenges at the organizational, architectural, and protocol levels in order to guarantee security levels to each customer. These levels must be identified to guide the architectural and technical choices to be made, in particular to meet the Level of Assurance (LoA) and Level of Trust (LoT) requirements that a Cloud provider must put in place to guarantee and protect its resources. These obstacles and security challenges are addressed in this research work, which is part of the project Security for Future Networks (SecFuNet). SecFuNet is a collaborative project between Europe and Brazil, involving nine European partners (from France, Poland, Germany, and Portugal) and seven Brazilian academic partners. The project's ambition is to propose a new general security infrastructure for communicating users' information over the Internet, and its main objective is to design and develop a new, coherent security architecture for virtual networks. / Today, the Internet is radically changing our habits, especially with the massive influx of nomadic techniques, the Internet of Things, the growing use of grid computing, wireless networks, and the emergence of new approaches in recent years. In particular, the virtualization of computing infrastructures, which allowed a new model called Cloud Computing to be defined and introduced a fairly clear break with traditional models, can be seen as a preparatory stage towards the Internet of the future. The implementation of these approaches allows the computing system to be pooled and organized differently: physical infrastructures can be dematerialized and applications moved onto remote containers. The global architecture of the Internet therefore has to evolve, relying strongly on these new approaches, in particular Cloud Computing and virtualization. However, no system is infallible, especially when resources are distributed and shared; these approaches raise a number of problems and directly involve security issues, which remain one of the main barriers to the adoption of these technologies. Like any new technology, Cloud Computing and virtualization create new risks, which add to the traditional threats: the management of outsourcing, privilege separation, identity and access management, the robustness of virtualization software, virtual machine isolation, the protection of personal data, reversibility, privacy, and so on. The traditional Internet architecture cannot provide adequate solutions to the challenges raised by these new approaches: mobility, flexibility, security requirements, reliability, and robustness. Thus, the research project SecFuNet (Security For Future Networks) was validated by the European Commission to provide some answers, to survey the state of the art of these security mechanisms, and to carry out a comprehensive study of orchestration and integration techniques based on protection components within an overall security architecture.
|
118 |
Bezpečné kryptografické algoritmy / Safe Cryptography Algorithms. Zbránek, Lukáš. January 2008.
This thesis describes cryptographic algorithms and compares their properties: weak and strong points and the correct use of particular algorithms. The main topics are the safety of the algorithms, their weaknesses and improvements, and how difficult they are to break. As a complement to ciphers, hash functions are also considered, and the most common methods of cryptanalysis are presented. As a practical application of the described algorithms, I analyze the secure data transfer systems SSH and SSL/TLS and demonstrate an attack on an SSL connection. The conclusion recommends safe algorithms for further use as well as safe parameters for SSH and SSL/TLS connections.
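As an example of what safe SSL/TLS parameters can look like on the client side, the sketch below builds a Python ssl context that enforces certificate verification and refuses protocol versions below TLS 1.2. The exact version floor and settings recommended in the thesis may differ, so treat these values as assumptions.

```python
import socket
import ssl

def make_strict_client_context() -> ssl.SSLContext:
    """Client-side TLS context with conservative, widely recommended settings."""
    context = ssl.create_default_context()            # verification + hostname check on
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 / TLS 1.0 / TLS 1.1
    return context

def probe(host: str) -> None:
    """Connect to a host and report the negotiated protocol version and cipher."""
    context = make_strict_client_context()
    with socket.create_connection((host, 443), timeout=10) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            print(host, "negotiated", tls_sock.version(), tls_sock.cipher()[0])

if __name__ == "__main__":
    probe("example.org")   # placeholder host
```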
|
119 |
Investigation of above-ground biomass with terrestrial laser scanning : A case study of Valls Hage in Gävle. Billenberg, Mathias. January 2023.
This thesis investigates the use of terrestrial laser scanning (TLS) to estimate above-ground biomass (AGB) in a study area in Valls Hage, Gävle. TLS field measurements were used to collect highly detailed point clouds of two tree species for AGB estimation and comparison against validation data. The TLS-derived data were validated using a non-destructive method based on direct field measurements with tape measures and a Trimble SX12, extracting diameter at breast height (DBH), tree height, and crown diameter; wood density was obtained from the literature. Data processing for segmentation, filtering, and generation of the quantitative structure model (QSM) was performed with the SimpleForest tool in the Computree software. A statistical analysis was carried out using linear regression, and AGB was estimated as the QSM-derived volume multiplied by wood density. Comparing the TLS QSM estimates with field validation based on a DBH-based, tree-specific allometric equation gave an RMSE of 154 kg with a near-perfect agreement of 0.997, while comparing the TLS QSM estimates with the TLS-validation DBH values in the same tree-specific equation gave an RMSE of 189 kg with an agreement of 0.990. The comparison between TLS-derived DBH and field validation was accurate, with insignificant differences, whereas tree height showed noticeable differences and crown diameter relatively small ones. The thesis highlights the challenges encountered during data processing and the importance of TLS data for accurate AGB estimation, with potential for refinement and for integrating internal tree-structure information to improve allometric models in future studies.
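The core arithmetic of the comparison is simple: TLS-based AGB is the QSM volume multiplied by wood density, the reference AGB comes from a DBH-based allometric equation, and the two are compared with RMSE. The sketch below shows that calculation; the wood density, the allometric coefficients, and the per-tree values are placeholders, not the species-specific equations or measurements used in the thesis.

```python
import numpy as np

WOOD_DENSITY = 550.0   # kg/m^3, placeholder literature value for one species

def agb_from_qsm(volume_m3: np.ndarray, density: float = WOOD_DENSITY) -> np.ndarray:
    """Above-ground biomass (kg) = QSM-derived woody volume x wood density."""
    return volume_m3 * density

def agb_allometric(dbh_cm: np.ndarray, a: float = 0.11, b: float = 2.5) -> np.ndarray:
    """Generic DBH-based allometric form AGB = a * DBH^b (coefficients assumed)."""
    return a * dbh_cm ** b

def rmse(estimate: np.ndarray, reference: np.ndarray) -> float:
    """Root-mean-square error between two per-tree AGB series."""
    return float(np.sqrt(np.mean((estimate - reference) ** 2)))

# Example with made-up per-tree values
volumes = np.array([1.8, 2.4, 0.9])      # m^3 from the QSMs
dbh = np.array([38.0, 43.0, 29.0])       # cm from field validation
print("RMSE (kg):", rmse(agb_from_qsm(volumes), agb_allometric(dbh)))
```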
|
120 |
Longitudinal analysis of the certificate chains of big tech company domains / Longitudinell analys av certifikatkedjor till domäner tillhörande stora teknikföretag. Klasson, Sebastian, Lindström, Nina. January 2021.
The internet is one of the most widely used media for communication in modern society and has become an everyday necessity for many; it is therefore of utmost importance that it remains as secure as possible. SSL and TLS are the backbone of internet security, and the certificates used are an integral part of these technologies. Certificate authorities (CAs) can issue certificates that validate that domains are who they claim to be. If a user trusts a CA, they can in turn also trust the domains that the CA has validated. CAs can likewise trust other CAs, which creates a chain of trust called a certificate chain. In this thesis, the structure of these certificate chains is analysed and a longitudinal dataset is created. The analysis looks at how the certificate chains have changed over time, with extra focus on the domains of big tech companies. The dataset created can also be used for further analysis in the future and will be a useful tool for examining historical certificate chains. Our findings show that the certificate chains of the studied domains do change over time: both their structure and their lengths vary noticeably. Most of the observed domains show a decrease in average chain length between 2013 and 2020, and the structure of the chains varies significantly over the years.
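A minimal sketch of the per-domain measurement behind such chain statistics: given the PEM bundle a server presents (as stored in datasets of this kind), split it into individual certificates and report the chain length together with the subject and issuer of each link. The file name and helper functions are illustrative assumptions, and the sketch assumes a clean concatenation of PEM blocks.

```python
from cryptography import x509

BEGIN = b"-----BEGIN CERTIFICATE-----"
END = b"-----END CERTIFICATE-----"

def parse_chain(pem_bundle: bytes) -> list:
    """Split a concatenated PEM bundle into individual x509 certificates, leaf first."""
    certs = []
    for block in pem_bundle.split(END):
        block = block.strip()
        if block.startswith(BEGIN):
            certs.append(x509.load_pem_x509_certificate(block + b"\n" + END + b"\n"))
    return certs

def describe_chain(pem_bundle: bytes) -> None:
    """Print the chain length and the subject/issuer of every certificate in the chain."""
    chain = parse_chain(pem_bundle)
    print(f"chain length: {len(chain)}")
    for depth, cert in enumerate(chain):
        print(f"  [{depth}] subject={cert.subject.rfc4514_string()} "
              f"issuer={cert.issuer.rfc4514_string()}")

if __name__ == "__main__":
    with open("chain.pem", "rb") as handle:       # placeholder path to a PEM bundle
        describe_chain(handle.read())
```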
|